Mar 20 14:51:17 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 14:51:17 crc restorecon[4694]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 14:51:17 crc restorecon[4694]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:17
crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc 
restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 14:51:17 crc restorecon[4694]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:17 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 
crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc 
restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc 
restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc 
restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 14:51:18 crc restorecon[4694]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 14:51:18 crc kubenswrapper[4764]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.852313 4764 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.862872 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863371 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863423 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863431 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863438 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863444 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863454 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863465 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863471 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863476 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863481 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863486 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863491 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863496 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863504 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863510 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863516 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863525 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863530 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863535 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863540 4764 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863546 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863553 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863558 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863565 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863575 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863581 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863586 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863591 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863598 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863603 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863608 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863612 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863617 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863621 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863626 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863630 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863636 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863640 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863645 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863650 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863655 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863660 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863664 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863669 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863675 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863682 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863688 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863693 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863698 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863703 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863709 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863714 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863719 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863724 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863735 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863740 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863745 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863750 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863755 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863763 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863768 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863772 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863778 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863783 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863787 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863794 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863801 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863806 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863811 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.863816 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864037 4764 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864053 4764 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864067 4764 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864076 4764 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864085 4764 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864091 4764 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864102 4764 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864109 4764 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864117 4764 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864123 4764 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864129 4764 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864135 4764 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864141 4764 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864147 4764 flags.go:64] FLAG: --cgroup-root=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864154 4764 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864159 4764 flags.go:64] FLAG: --client-ca-file=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864166 4764 flags.go:64] FLAG: --cloud-config=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864171 4764 flags.go:64] FLAG: --cloud-provider=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864176 4764 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864186 4764 flags.go:64] FLAG: --cluster-domain=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864192 4764 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864198 4764 flags.go:64] FLAG: --config-dir=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864203 4764 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864209 4764 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864217 4764 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864222 4764 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864228 4764 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864234 4764 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864239 4764 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864245 4764 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864251 4764 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864257 4764 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864262 4764 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864271 4764 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864276 4764 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864282 4764 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864287 4764 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864292 4764 flags.go:64] FLAG: --enable-server="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864298 4764 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864308 4764 flags.go:64] FLAG: --event-burst="100"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864315 4764 flags.go:64] FLAG: --event-qps="50"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864320 4764 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864326 4764 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864331 4764 flags.go:64] FLAG: --eviction-hard=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864338 4764 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864343 4764 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864349 4764 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864355 4764 flags.go:64] FLAG: --eviction-soft=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864361 4764 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864367 4764 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864372 4764 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864400 4764 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864405 4764 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864411 4764 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864416 4764 flags.go:64] FLAG: --feature-gates=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864424 4764 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864430 4764 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864436 4764 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864442 4764 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864448 4764 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864453 4764 flags.go:64] FLAG: --help="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864459 4764 flags.go:64] FLAG: --hostname-override=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864464 4764 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864470 4764 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864475 4764 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864480 4764 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864485 4764 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864491 4764 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864496 4764 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864501 4764 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864506 4764 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864512 4764 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864519 4764 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864525 4764 flags.go:64] FLAG: --kube-reserved=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864530 4764 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864535 4764 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864540 4764 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864545 4764 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864551 4764 flags.go:64] FLAG: --lock-file=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864556 4764 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864562 4764 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864568 4764 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864578 4764 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864585 4764 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864590 4764 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864595 4764 flags.go:64] FLAG: --logging-format="text"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864600 4764 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864606 4764 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864611 4764 flags.go:64] FLAG: --manifest-url=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864616 4764 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864624 4764 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864629 4764 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864635 4764 flags.go:64] FLAG: --max-pods="110"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864641 4764 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864646 4764 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864651 4764 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864656 4764 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864662 4764 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864666 4764 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864672 4764 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864695 4764 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864701 4764 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864707 4764 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864714 4764 flags.go:64] FLAG: --pod-cidr=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864719 4764 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864728 4764 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864734 4764 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864739 4764 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864744 4764 flags.go:64] FLAG: --port="10250"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864767 4764 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864772 4764 flags.go:64] FLAG: --provider-id=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864777 4764 flags.go:64] FLAG: --qos-reserved=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864783 4764 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864789 4764 flags.go:64] FLAG: --register-node="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864794 4764 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864799 4764 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864810 4764 flags.go:64] FLAG: --registry-burst="10"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864815 4764 flags.go:64] FLAG: --registry-qps="5"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864821 4764 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864827 4764 flags.go:64] FLAG: --reserved-memory=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864834 4764 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864840 4764 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864845 4764 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864850 4764 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864855 4764 flags.go:64] FLAG: --runonce="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864860 4764 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864865 4764 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864871 4764 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864876 4764 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864881 4764 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864886 4764 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864892 4764 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864898 4764 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864920 4764 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864926 4764 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864931 4764 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864936 4764 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864941 4764 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864947 4764 flags.go:64] FLAG: --system-cgroups=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864977 4764 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864989 4764 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864994 4764 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.864999 4764 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865005 4764 flags.go:64] FLAG: --tls-min-version=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865010 4764 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865023 4764 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865028 4764 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865033 4764 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865039 4764 flags.go:64] FLAG: --v="2"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865047 4764 flags.go:64] FLAG: --version="false"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865055 4764 flags.go:64] FLAG: --vmodule=""
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865062 4764 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.865068 4764 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865269 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865275 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865282 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865287 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865292 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865296 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865301 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865305 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865309 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865313 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865317 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865321 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865331 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865336 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865340 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865344 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865348 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865351 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865356 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865360 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865364 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865369 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865372 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865396 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865405 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865410 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865415 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865419 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865424 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865430 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865435 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865439 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865443 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865447 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865451 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865455 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865460 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865464 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865468 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865473 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865477 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865481 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865485 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865495 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865503 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865507 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865512 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865517 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865521 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865525 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865529 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865533 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865537 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865541 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865545 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865551 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865559 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865564 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865570 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865575 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865579 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865584 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865588 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865592 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865596 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865601 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865605 4764 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865609 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865613 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865617 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.865621 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.866575 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.881252 4764 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.881316 4764 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881518 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881533 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881542 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881552 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881564 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881574 4764 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881584 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881594 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881602 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881611 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881619 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881627 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881635 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881642 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881650 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881658 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881666 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881674 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881682 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881690 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881701 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881711 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881720 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881727 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881735 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881743 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881751 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881759 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881767 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881774 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881783 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881790 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881798 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881809 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881820 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881828 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881837 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881847 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881857 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881866 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881875 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881885 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881894 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881903 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881911 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881920 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881928 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881936 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881944 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881952 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881959 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881967 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881974 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881982 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881990 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.881997 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882005 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882013 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882021 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882029 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882037 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882044 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882053 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882060 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882068 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882075 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882083 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882091 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882099 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882106 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882117 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.882130 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882408 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882425 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882437 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882448 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882457 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882466 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882476 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882483 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882491 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882499 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882507 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882515 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882523 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882530 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882538 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882546 4764 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882555 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882563 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882572 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882581 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882591 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882599 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882606 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882614 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882622 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882630 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882638 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882646 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882653 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882661 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882669 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882678 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882685 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882694 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882703 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882711 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882719 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882727 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882735 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882742 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882750 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882758 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882766 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882775 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882785 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882795 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882803 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882812 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882820 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882828 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882836 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882845 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882853 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882863 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882874 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882884 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882892 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882901 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882909 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882917 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882925 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882933 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882942 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882950 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882958 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882966 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882974 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882982 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882990 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.882998 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 14:51:18 crc kubenswrapper[4764]: W0320 14:51:18.883009 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.883022 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.885223 4764 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 14:51:18 crc kubenswrapper[4764]: E0320 14:51:18.894949 4764 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.899263 4764 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.899436 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.901639 4764 server.go:997] "Starting client certificate rotation"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.901675 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.902286 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.928665 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.930728 4764 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 14:51:18 crc kubenswrapper[4764]: E0320 14:51:18.932289 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.946452 4764 log.go:25] "Validated CRI v1 runtime API"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.986235 4764 log.go:25] "Validated CRI v1 image API"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.989093 4764 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.997909 4764 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-14-46-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 14:51:18 crc kubenswrapper[4764]: I0320 14:51:18.997961 4764 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.028092 4764 manager.go:217] Machine: {Timestamp:2026-03-20 14:51:19.024413408 +0000 UTC m=+0.640602597 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:692a0227-1a43-4617-b4d8-dc30f2b9fadb BootID:0eabb4fb-5c38-457c-9d72-ffd7b7b559d0 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:04:fd:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:04:fd:bf Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:07:7a:b7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6b:09:23 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:26:91:4a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6d:2a:bd Speed:-1 Mtu:1496} {Name:eth10 MacAddress:c2:75:88:4d:a4:dc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e2:f8:5f:c5:16:e9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.028555 4764 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.028831 4764 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.029315 4764 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.029680 4764 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.029750 4764 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.031202 4764 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.031242 4764 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.031866 4764 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.031916 4764 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.032855 4764 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.033013 4764 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.037816 4764 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.037862 4764 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.037915 4764 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.037939 4764 kubelet.go:324] "Adding apiserver pod source"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.037960 4764 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.042963 4764 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.043574 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.043713 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError"
Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.043692 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.043811 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.044092 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.046925 4764 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048647 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048696 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048712 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048731 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048759 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048779 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048797 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048820 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048845 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048891 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048929 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.048943 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.049879 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.050746 4764 server.go:1280] "Started kubelet"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.051526 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.052643 4764 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.052741 4764 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 14:51:19 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.054127 4764 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.055242 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.055330 4764 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.055663 4764 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.055699 4764 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.055777 4764 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.056431 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="200ms"
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.056469 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.056652 4764 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.060711 4764 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.060793 4764 factory.go:55] Registering systemd factory
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.060846 4764 factory.go:221] Registration of the systemd container factory successfully
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.061792 4764 factory.go:153] Registering CRI-O factory
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.061861 4764 factory.go:221] Registration of the crio container factory successfully
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.061922 4764 factory.go:103] Registering Raw factory
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.061970 4764 manager.go:1196] Started watching for new ooms in manager
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.061700 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9439c3ddc210 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,LastTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.064143 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused
Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.064296 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError"
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.068463 4764 manager.go:319] Starting recovery of all containers
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075196 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075357 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075457 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075546 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075619 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075677 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075736 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075817 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075889 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.075948 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076016 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076083 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076151 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076260 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076323 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076448 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076532 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076602 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076687 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076750 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076837 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076896 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.076969 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077094 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077189 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077253 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077314 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077371 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077459 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077517 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077576 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077633 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077688 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077750 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077809 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077874 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077931 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.077988 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078047 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078117 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078210 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078269 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078328 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078399 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078475 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078534 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078590 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078647 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078705 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078771 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078830 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078891 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.078956 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079015 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079078 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079149 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079219 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079278 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079334 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079412 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079489 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079548 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079606 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079662 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079719 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079782 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079840 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079906 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.079963 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080019 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080081 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080140 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080202 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080267 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080331 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080472 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080565 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080628 4764 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080689 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080747 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080804 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080870 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080931 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.080993 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081055 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081113 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081175 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081241 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081306 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081406 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081473 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081531 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081599 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081660 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081718 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081776 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081834 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081895 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.081951 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082007 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082062 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082127 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082195 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082255 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082317 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082387 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082446 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082513 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082574 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082631 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082690 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082746 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082800 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082864 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082921 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.082980 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083038 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083094 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083156 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083214 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083267 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083320 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083388 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083459 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083534 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083604 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083664 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083721 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083774 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083840 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083896 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.083959 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084020 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084082 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084140 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084202 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084257 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084311 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 14:51:19 crc 
kubenswrapper[4764]: I0320 14:51:19.084366 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084474 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084556 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084614 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084671 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084738 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084817 4764 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084899 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.084980 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085088 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085174 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085252 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085336 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085438 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085516 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085595 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.085664 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.087644 4764 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.087771 4764 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.087853 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.087933 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088001 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088088 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088162 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088238 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088316 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088440 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088544 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088631 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088710 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088784 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088867 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.088941 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089022 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089114 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089186 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089260 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089339 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089441 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089521 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089594 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089664 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089739 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 
20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089809 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089888 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.089963 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090031 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090106 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090184 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090261 4764 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090336 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090423 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090498 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090573 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090771 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090853 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.090923 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091003 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091075 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091161 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091230 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091301 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091374 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091463 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091532 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091614 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091693 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091767 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091844 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091913 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.091987 4764 reconstruct.go:97] "Volume reconstruction finished" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.092054 4764 reconciler.go:26] "Reconciler: start to sync state" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.108658 4764 manager.go:324] Recovery completed Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.121606 4764 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.124872 4764 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.124920 4764 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.124956 4764 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.125014 4764 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.125756 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.126240 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.126296 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127964 4764 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127981 4764 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.127997 4764 state_mem.go:36] "Initialized new in-memory state store" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.146948 4764 policy_none.go:49] "None policy: Start" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.148333 4764 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.148441 4764 state_mem.go:35] "Initializing new in-memory state store" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.156530 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.225844 4764 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.226356 4764 manager.go:334] "Starting Device Plugin manager" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.226465 4764 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.226486 4764 server.go:79] "Starting device plugin registration server" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.227353 4764 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.227417 4764 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.227685 4764 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.227836 4764 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.227856 4764 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.241691 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.257548 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="400ms" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.327746 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.329975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.330033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.330051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.330097 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.330929 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.426088 4764 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.426268 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.428778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.428870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.428890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.429267 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.429448 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.429539 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.431724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.432001 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.432337 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.432478 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.433767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.433819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.433841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.433775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.433995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.434015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.434215 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.434358 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.434461 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.435841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.436112 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.436263 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.436333 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437828 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437878 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.437962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.439094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.439139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.439180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498345 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 
14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.498966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.499091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.499197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.499244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.499293 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc 
kubenswrapper[4764]: I0320 14:51:19.531823 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.534029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.534100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.534120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.534162 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.535224 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601163 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601419 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601463 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601755 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.601977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602105 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602148 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: 
I0320 14:51:19.602284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602305 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.602321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.659576 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="800ms" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.769984 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.800364 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.817120 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.824529 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0139bd69020368b96bc3b484dc59e144c5871d1cdf8446ed9a2e4370917508f4 WatchSource:0}: Error finding container 0139bd69020368b96bc3b484dc59e144c5871d1cdf8446ed9a2e4370917508f4: Status 404 returned error can't find the container with id 0139bd69020368b96bc3b484dc59e144c5871d1cdf8446ed9a2e4370917508f4 Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.840762 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.844604 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-7d2249f6939fc387c2f80a09152aa91023c73300d95b4287ac7d71872b617b93 WatchSource:0}: Error finding container 7d2249f6939fc387c2f80a09152aa91023c73300d95b4287ac7d71872b617b93: Status 404 returned error can't find the container with id 7d2249f6939fc387c2f80a09152aa91023c73300d95b4287ac7d71872b617b93 Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.847321 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-43523810b79fa75d10e714b431afde61bd4abd4b566d42569e2a290ba466e867 WatchSource:0}: Error finding container 43523810b79fa75d10e714b431afde61bd4abd4b566d42569e2a290ba466e867: Status 404 returned error can't find the container with id 43523810b79fa75d10e714b431afde61bd4abd4b566d42569e2a290ba466e867 Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.849161 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.868225 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-899e05ba8357335381f847d1347649db49d55aeeb4ce0007db660c28afec88d9 WatchSource:0}: Error finding container 899e05ba8357335381f847d1347649db49d55aeeb4ce0007db660c28afec88d9: Status 404 returned error can't find the container with id 899e05ba8357335381f847d1347649db49d55aeeb4ce0007db660c28afec88d9 Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.872609 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9b0cfea19e6afa4da1013a87b0da745765f76fd6a8f8bc4c7656e2b8f088d7db WatchSource:0}: Error finding container 9b0cfea19e6afa4da1013a87b0da745765f76fd6a8f8bc4c7656e2b8f088d7db: Status 404 returned error can't find the container with id 9b0cfea19e6afa4da1013a87b0da745765f76fd6a8f8bc4c7656e2b8f088d7db Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.875031 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9439c3ddc210 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,LastTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:19 crc kubenswrapper[4764]: W0320 14:51:19.888074 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.888215 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.936173 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.938555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.938607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.938616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:19 crc kubenswrapper[4764]: I0320 14:51:19.938648 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:19 crc kubenswrapper[4764]: E0320 14:51:19.939256 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.052743 4764 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.130123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d2249f6939fc387c2f80a09152aa91023c73300d95b4287ac7d71872b617b93"} Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.131778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0139bd69020368b96bc3b484dc59e144c5871d1cdf8446ed9a2e4370917508f4"} Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.133188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b0cfea19e6afa4da1013a87b0da745765f76fd6a8f8bc4c7656e2b8f088d7db"} Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.134228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"899e05ba8357335381f847d1347649db49d55aeeb4ce0007db660c28afec88d9"} Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.135001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"43523810b79fa75d10e714b431afde61bd4abd4b566d42569e2a290ba466e867"} Mar 20 14:51:20 crc kubenswrapper[4764]: W0320 14:51:20.389041 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.389193 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.461160 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="1.6s" Mar 20 14:51:20 crc kubenswrapper[4764]: W0320 14:51:20.507247 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.507480 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:20 crc kubenswrapper[4764]: W0320 14:51:20.716598 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.716705 
4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.739666 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.741730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.741806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.741828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.741880 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.742807 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Mar 20 14:51:20 crc kubenswrapper[4764]: I0320 14:51:20.988393 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 14:51:20 crc kubenswrapper[4764]: E0320 14:51:20.989729 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.64:6443: connect: 
connection refused" logger="UnhandledError" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.053000 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.141912 4764 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4" exitCode=0 Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.142058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.142116 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.144023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.144071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.144090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.146624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.146697 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.149486 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe" exitCode=0 Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.149600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.149682 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.151270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.151309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.151328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.153138 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a" exitCode=0 Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.153282 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.153310 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.153587 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.155722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.159950 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7" exitCode=0 Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.160036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7"} Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.160178 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.163315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.163371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:21 crc kubenswrapper[4764]: I0320 14:51:21.163446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.052524 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:22 crc kubenswrapper[4764]: E0320 14:51:22.062409 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="3.2s" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.168576 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.168875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 
14:51:22.168993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.169017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.169441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.169481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.169491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.172683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.172731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.172793 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:22 crc kubenswrapper[4764]: 
I0320 14:51:22.174246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.174309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.174330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.175894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.175939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.175959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.177977 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c" exitCode=0 Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.178053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c"} Mar 20 14:51:22 crc 
kubenswrapper[4764]: I0320 14:51:22.178128 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.179594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.179681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.179703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.180658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d"} Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.180698 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.181443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.181478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.181488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.342968 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.345875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.345928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.345944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:22 crc kubenswrapper[4764]: I0320 14:51:22.345978 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:22 crc kubenswrapper[4764]: E0320 14:51:22.346573 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.64:6443: connect: connection refused" node="crc" Mar 20 14:51:22 crc kubenswrapper[4764]: W0320 14:51:22.481059 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.64:6443: connect: connection refused Mar 20 14:51:22 crc kubenswrapper[4764]: E0320 14:51:22.481162 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.64:6443: connect: connection refused" logger="UnhandledError" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.189982 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97" exitCode=0 Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.190090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97"} Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.190214 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.192066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.192124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.192138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90f5b91a3e3728106fa78942e4a1653f0c67f64139b0778f8721c77fc5174513"} Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200719 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200568 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200868 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200692 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.200631 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 
14:51:23.200777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74"} Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.202242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.202300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.202317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.203803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:23 crc kubenswrapper[4764]: I0320 14:51:23.578016 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.209997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54"} Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.210093 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.210102 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.210103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9"} Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.210414 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.210427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb"} Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.215522 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.215695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.215739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.217468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.217553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:24 crc kubenswrapper[4764]: I0320 14:51:24.217575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.014840 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.219423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e"} Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.219499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4"} Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.220802 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.222261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.222329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.222354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.516309 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.516577 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.516645 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.518454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.518510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.518528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.547110 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.548607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.548660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.548679 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.548722 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:25 crc kubenswrapper[4764]: I0320 14:51:25.847035 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.222769 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.222866 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.223637 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.224620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.224695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.224714 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.225511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.225571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:26 crc kubenswrapper[4764]: I0320 14:51:26.225591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:27 crc kubenswrapper[4764]: I0320 14:51:27.240616 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:27 crc kubenswrapper[4764]: I0320 14:51:27.240905 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:27 crc kubenswrapper[4764]: I0320 14:51:27.242605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:27 crc kubenswrapper[4764]: I0320 14:51:27.242825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:27 crc kubenswrapper[4764]: I0320 14:51:27.242969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.158160 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.158443 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.160139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.160176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.160188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.469355 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.469666 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:28 crc 
kubenswrapper[4764]: I0320 14:51:28.471808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.471868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:28 crc kubenswrapper[4764]: I0320 14:51:28.471887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:29 crc kubenswrapper[4764]: I0320 14:51:29.224144 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 14:51:29 crc kubenswrapper[4764]: I0320 14:51:29.224514 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:29 crc kubenswrapper[4764]: I0320 14:51:29.226568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:29 crc kubenswrapper[4764]: I0320 14:51:29.226638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:29 crc kubenswrapper[4764]: I0320 14:51:29.226660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:29 crc kubenswrapper[4764]: E0320 14:51:29.242666 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:51:30 crc kubenswrapper[4764]: I0320 14:51:30.185757 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:30 crc kubenswrapper[4764]: I0320 14:51:30.186086 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:30 crc kubenswrapper[4764]: I0320 14:51:30.187855 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:30 crc kubenswrapper[4764]: I0320 14:51:30.187909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:30 crc kubenswrapper[4764]: I0320 14:51:30.187928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.341259 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.343709 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.347088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.347129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.347141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.348832 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:31 crc kubenswrapper[4764]: I0320 14:51:31.653613 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.242655 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.244398 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.244938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.244964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.247958 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:32 crc kubenswrapper[4764]: W0320 14:51:32.841542 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.841721 4764 trace.go:236] Trace[790864258]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 14:51:22.839) (total time: 10002ms): Mar 20 14:51:32 crc kubenswrapper[4764]: Trace[790864258]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (14:51:32.841) Mar 20 14:51:32 crc kubenswrapper[4764]: Trace[790864258]: [10.002348844s] [10.002348844s] END Mar 20 14:51:32 crc kubenswrapper[4764]: E0320 14:51:32.841767 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.964454 4764 patch_prober.go:28] 
interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 14:51:32 crc kubenswrapper[4764]: I0320 14:51:32.964604 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 14:51:33 crc kubenswrapper[4764]: W0320 14:51:33.004221 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.004361 4764 trace.go:236] Trace[452179939]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 14:51:23.002) (total time: 10001ms): Mar 20 14:51:33 crc kubenswrapper[4764]: Trace[452179939]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:51:33.004) Mar 20 14:51:33 crc kubenswrapper[4764]: Trace[452179939]: [10.001493592s] [10.001493592s] END Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.004441 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 
14:51:33.053548 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 14:51:33 crc kubenswrapper[4764]: W0320 14:51:33.129955 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.130082 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.132079 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9439c3ddc210 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,LastTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC 
m=+0.666880257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.139699 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.148432 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.148492 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.150330 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 14:51:33 crc kubenswrapper[4764]: W0320 14:51:33.153364 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.153493 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 14:51:33 crc kubenswrapper[4764]: E0320 14:51:33.154069 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:33Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.157011 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.157159 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.185999 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.186093 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.245273 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.246075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.246104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:33 crc kubenswrapper[4764]: I0320 14:51:33.246112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.059694 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:34Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.250952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.253406 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90f5b91a3e3728106fa78942e4a1653f0c67f64139b0778f8721c77fc5174513" exitCode=255
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.253425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"90f5b91a3e3728106fa78942e4a1653f0c67f64139b0778f8721c77fc5174513"}
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.253609 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.253668 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.255590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:34 crc kubenswrapper[4764]: I0320 14:51:34.256368 4764 scope.go:117] "RemoveContainer" containerID="90f5b91a3e3728106fa78942e4a1653f0c67f64139b0778f8721c77fc5174513"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.058751 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:35Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.260893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.261856 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.265269 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e" exitCode=255
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.265370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"}
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.265556 4764 scope.go:117] "RemoveContainer" containerID="90f5b91a3e3728106fa78942e4a1653f0c67f64139b0778f8721c77fc5174513"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.265674 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.267430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.267476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.267494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.268422 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"
Mar 20 14:51:35 crc kubenswrapper[4764]: E0320 14:51:35.268749 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 14:51:35 crc kubenswrapper[4764]: I0320 14:51:35.856846 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.057724 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:36 crc kubenswrapper[4764]: W0320 14:51:36.269067 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:36 crc kubenswrapper[4764]: E0320 14:51:36.269457 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.273891 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.280779 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.282200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.282271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.282292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.283453 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"
Mar 20 14:51:36 crc kubenswrapper[4764]: E0320 14:51:36.283836 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 14:51:36 crc kubenswrapper[4764]: I0320 14:51:36.288697 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 14:51:36 crc kubenswrapper[4764]: W0320 14:51:36.632667 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:36 crc kubenswrapper[4764]: E0320 14:51:36.632806 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 14:51:36 crc kubenswrapper[4764]: W0320 14:51:36.999214 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:36 crc kubenswrapper[4764]: E0320 14:51:36.999342 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.058145 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:37Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.284878 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.286574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.286651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.286672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:37 crc kubenswrapper[4764]: I0320 14:51:37.287843 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"
Mar 20 14:51:37 crc kubenswrapper[4764]: E0320 14:51:37.288312 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.058129 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:51:38Z is after 2026-02-23T05:33:13Z
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.470004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.470345 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.472342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.472445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.472467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:38 crc kubenswrapper[4764]: I0320 14:51:38.474026 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"
Mar 20 14:51:38 crc kubenswrapper[4764]: E0320 14:51:38.474464 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.078786 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:39 crc kubenswrapper[4764]: E0320 14:51:39.243000 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.263987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.264261 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.265824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.265892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.265911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.283839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.290320 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.291421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.291475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.291498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.539991 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.542195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.542280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.542301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:39 crc kubenswrapper[4764]: I0320 14:51:39.542352 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 14:51:39 crc kubenswrapper[4764]: E0320 14:51:39.551241 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 14:51:39 crc kubenswrapper[4764]: E0320 14:51:39.562352 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 14:51:40 crc kubenswrapper[4764]: I0320 14:51:40.059468 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:40 crc kubenswrapper[4764]: W0320 14:51:40.388547 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:40 crc kubenswrapper[4764]: E0320 14:51:40.388638 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 14:51:41 crc kubenswrapper[4764]: I0320 14:51:41.058224 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:41 crc kubenswrapper[4764]: I0320 14:51:41.510777 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 14:51:41 crc kubenswrapper[4764]: I0320 14:51:41.575794 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.060806 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.964646 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.965088 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.967205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.967280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.967298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:51:42 crc kubenswrapper[4764]: I0320 14:51:42.968260 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e"
Mar 20 14:51:42 crc kubenswrapper[4764]: E0320 14:51:42.968618 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 14:51:43 crc kubenswrapper[4764]: I0320 14:51:43.060486 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.144072 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c3ddc210 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,LastTimestamp:2026-03-20 14:51:19.050691088 +0000 UTC m=+0.666880257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.151709 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.160777 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.168324 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.176439 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439cec9382a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.233894442 +0000 UTC m=+0.850083601,LastTimestamp:2026-03-20 14:51:19.233894442 +0000 UTC m=+0.850083601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.184428 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.330012242 +0000 UTC m=+0.946201401,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: I0320 14:51:43.186458 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 14:51:43 crc kubenswrapper[4764]: I0320 14:51:43.186767 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.192074 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.330044789 +0000 UTC m=+0.946233958,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.198699 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.330061597 +0000 UTC m=+0.946250766,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.207020 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.428846181 +0000 UTC m=+1.045035310,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.214828 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.428883497 +0000 UTC m=+1.045072626,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.221205 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.428901175 +0000 UTC m=+1.045090314,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.224273 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.431556361 +0000 UTC m=+1.047745520,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.229347 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.431585928 +0000 UTC m=+1.047775087,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.230936 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.431602566 +0000 UTC m=+1.047791725,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.238215 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.431665468 +0000 UTC m=+1.047854607,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.245748 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.431691805 +0000 UTC m=+1.047880934,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.252987 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC 
m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.43173645 +0000 UTC m=+1.047925579,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.260425 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.433809035 +0000 UTC m=+1.049998194,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.267602 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.433831963 +0000 UTC m=+1.050021122,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.275269 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.43385135 +0000 UTC m=+1.050040519,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.282881 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.433977425 +0000 UTC m=+1.050166594,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.297696 4764 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86de433\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86de433 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127245875 +0000 UTC m=+0.743435014,LastTimestamp:2026-03-20 14:51:19.434009382 +0000 UTC m=+1.050198541,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.306023 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86e1e97\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86e1e97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127260823 +0000 UTC m=+0.743449962,LastTimestamp:2026-03-20 14:51:19.43402517 +0000 UTC m=+1.050214329,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.313223 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.435743607 +0000 UTC m=+1.051932776,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.320362 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9439c86d557b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9439c86d557b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.127209339 +0000 UTC m=+0.743398478,LastTimestamp:2026-03-20 14:51:19.435767964 +0000 UTC m=+1.051957093,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.328625 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9439f2aa07b0 openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.835830192 +0000 UTC m=+1.452019351,LastTimestamp:2026-03-20 14:51:19.835830192 +0000 UTC m=+1.452019351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.331783 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9439f3c651fa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.854461434 +0000 UTC m=+1.470650603,LastTimestamp:2026-03-20 14:51:19.854461434 +0000 UTC m=+1.470650603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.335867 4764 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9439f3ca2d7a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.854714234 +0000 UTC m=+1.470903373,LastTimestamp:2026-03-20 14:51:19.854714234 +0000 UTC m=+1.470903373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.341598 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9439f503e590 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.875274128 +0000 UTC 
m=+1.491463287,LastTimestamp:2026-03-20 14:51:19.875274128 +0000 UTC m=+1.491463287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.348481 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9439f555e03a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:19.880646714 +0000 UTC m=+1.496835883,LastTimestamp:2026-03-20 14:51:19.880646714 +0000 UTC m=+1.496835883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.355781 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a194cf9d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.484043222 +0000 UTC m=+2.100232361,LastTimestamp:2026-03-20 14:51:20.484043222 +0000 UTC m=+2.100232361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.363099 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a19ab5c38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.490228792 +0000 UTC m=+2.106417931,LastTimestamp:2026-03-20 14:51:20.490228792 +0000 UTC m=+2.106417931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.370102 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a19b674aa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.490955946 +0000 UTC m=+2.107145085,LastTimestamp:2026-03-20 14:51:20.490955946 +0000 UTC m=+2.107145085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.376543 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e943a1a197415 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.497443861 +0000 UTC m=+2.113633000,LastTimestamp:2026-03-20 14:51:20.497443861 +0000 UTC m=+2.113633000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.383515 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a1a2ef180 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.498852224 +0000 UTC m=+2.115041363,LastTimestamp:2026-03-20 14:51:20.498852224 +0000 UTC m=+2.115041363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.390298 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a1a52dbed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.501205997 +0000 UTC m=+2.117395166,LastTimestamp:2026-03-20 14:51:20.501205997 +0000 UTC m=+2.117395166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.397904 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a1acc550a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.509166858 +0000 UTC m=+2.125356017,LastTimestamp:2026-03-20 14:51:20.509166858 +0000 UTC m=+2.125356017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.404962 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a1b2b908b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.515408011 +0000 UTC m=+2.131597160,LastTimestamp:2026-03-20 14:51:20.515408011 +0000 UTC m=+2.131597160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 
14:51:43.412763 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a1b46cfd1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.517193681 +0000 UTC m=+2.133383040,LastTimestamp:2026-03-20 14:51:20.517193681 +0000 UTC m=+2.133383040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.418527 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e943a1b66a69c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.519280284 +0000 UTC m=+2.135469433,LastTimestamp:2026-03-20 14:51:20.519280284 +0000 UTC m=+2.135469433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.424217 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a1c37b786 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.532981638 +0000 UTC m=+2.149170797,LastTimestamp:2026-03-20 14:51:20.532981638 +0000 UTC m=+2.149170797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.432182 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a311dcd0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.883604749 +0000 UTC m=+2.499793908,LastTimestamp:2026-03-20 14:51:20.883604749 
+0000 UTC m=+2.499793908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.436226 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a3210f200 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.899539456 +0000 UTC m=+2.515728625,LastTimestamp:2026-03-20 14:51:20.899539456 +0000 UTC m=+2.515728625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.442169 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a32310d8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.901643661 +0000 UTC m=+2.517832830,LastTimestamp:2026-03-20 14:51:20.901643661 +0000 UTC m=+2.517832830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.448006 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a40cab22b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.146593835 +0000 UTC m=+2.762783004,LastTimestamp:2026-03-20 14:51:21.146593835 +0000 UTC m=+2.762783004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.455169 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a412df5b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.153099184 +0000 UTC m=+2.769288343,LastTimestamp:2026-03-20 14:51:21.153099184 +0000 UTC m=+2.769288343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.463283 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a41a36985 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.160796549 +0000 UTC m=+2.776985718,LastTimestamp:2026-03-20 14:51:21.160796549 +0000 UTC m=+2.776985718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc 
kubenswrapper[4764]: E0320 14:51:43.470748 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e943a41eb2095 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.165496469 +0000 UTC m=+2.781685628,LastTimestamp:2026-03-20 14:51:21.165496469 +0000 UTC m=+2.781685628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.477963 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a433bdf50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.187565392 +0000 UTC m=+2.803754531,LastTimestamp:2026-03-20 14:51:21.187565392 +0000 UTC m=+2.803754531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.482992 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a457afe58 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.225256536 +0000 UTC m=+2.841445695,LastTimestamp:2026-03-20 14:51:21.225256536 +0000 UTC m=+2.841445695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.490103 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a45985272 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.22717861 +0000 UTC m=+2.843367779,LastTimestamp:2026-03-20 14:51:21.22717861 +0000 UTC m=+2.843367779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.497154 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a5036871e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.405318942 +0000 UTC m=+3.021508111,LastTimestamp:2026-03-20 14:51:21.405318942 +0000 UTC m=+3.021508111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.503235 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a510d814b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.419407691 +0000 UTC m=+3.035596840,LastTimestamp:2026-03-20 14:51:21.419407691 +0000 UTC m=+3.035596840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.519512 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a513bfe5c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.422454364 +0000 UTC m=+3.038643493,LastTimestamp:2026-03-20 14:51:21.422454364 +0000 UTC m=+3.038643493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.530432 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a52131450 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.436550224 +0000 UTC m=+3.052739343,LastTimestamp:2026-03-20 14:51:21.436550224 +0000 UTC m=+3.052739343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: W0320 14:51:43.530651 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.530895 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.537566 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e943a5216488c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.436760204 +0000 UTC m=+3.052949333,LastTimestamp:2026-03-20 14:51:21.436760204 +0000 UTC m=+3.052949333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.543273 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a52cf245f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.448875103 +0000 UTC m=+3.065064232,LastTimestamp:2026-03-20 14:51:21.448875103 +0000 UTC m=+3.065064232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.549830 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a52fc17d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.451821011 +0000 UTC m=+3.068010130,LastTimestamp:2026-03-20 14:51:21.451821011 +0000 UTC m=+3.068010130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.556074 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a5311eebf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.453252287 +0000 UTC m=+3.069441416,LastTimestamp:2026-03-20 14:51:21.453252287 +0000 UTC m=+3.069441416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.561800 4764 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e943a535039f6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.457334774 +0000 UTC m=+3.073523903,LastTimestamp:2026-03-20 14:51:21.457334774 +0000 UTC m=+3.073523903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.568412 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a549c41f3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.479094771 +0000 UTC m=+3.095283900,LastTimestamp:2026-03-20 14:51:21.479094771 +0000 UTC m=+3.095283900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.572683 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a54b81274 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.48091762 +0000 UTC m=+3.097106749,LastTimestamp:2026-03-20 14:51:21.48091762 +0000 UTC m=+3.097106749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.577460 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a54cb29bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.482168763 +0000 UTC m=+3.098357892,LastTimestamp:2026-03-20 14:51:21.482168763 +0000 UTC m=+3.098357892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.581228 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a5e7bf105 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.644749061 +0000 UTC m=+3.260938190,LastTimestamp:2026-03-20 14:51:21.644749061 +0000 UTC m=+3.260938190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.586060 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a5fa3f61f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.664149023 +0000 UTC m=+3.280338162,LastTimestamp:2026-03-20 14:51:21.664149023 +0000 UTC m=+3.280338162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.591345 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a5fc19bf0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.666092016 +0000 UTC m=+3.282281155,LastTimestamp:2026-03-20 14:51:21.666092016 +0000 UTC m=+3.282281155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.595428 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a629954f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.713784052 +0000 UTC m=+3.329973181,LastTimestamp:2026-03-20 14:51:21.713784052 +0000 UTC m=+3.329973181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.600097 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a63c2c106 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.73327591 +0000 UTC m=+3.349465079,LastTimestamp:2026-03-20 14:51:21.73327591 +0000 UTC m=+3.349465079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.604679 4764 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a63de914c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.7350987 +0000 UTC m=+3.351287829,LastTimestamp:2026-03-20 14:51:21.7350987 +0000 UTC m=+3.351287829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.609566 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a6f274e37 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.924415031 +0000 UTC m=+3.540604200,LastTimestamp:2026-03-20 14:51:21.924415031 +0000 
UTC m=+3.540604200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.613741 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e943a7088ca24 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.947580964 +0000 UTC m=+3.563770093,LastTimestamp:2026-03-20 14:51:21.947580964 +0000 UTC m=+3.563770093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.618627 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a736624f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:21.995642105 +0000 UTC m=+3.611831234,LastTimestamp:2026-03-20 14:51:21.995642105 +0000 UTC m=+3.611831234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.631213 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a74269a36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.00825503 +0000 UTC m=+3.624444159,LastTimestamp:2026-03-20 14:51:22.00825503 +0000 UTC m=+3.624444159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.636405 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a743b3338 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.00960492 +0000 UTC m=+3.625794059,LastTimestamp:2026-03-20 14:51:22.00960492 +0000 UTC m=+3.625794059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.644460 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a7e87cb30 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.18239672 +0000 UTC m=+3.798585859,LastTimestamp:2026-03-20 14:51:22.18239672 +0000 UTC m=+3.798585859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.650113 4764 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a814ee6d7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.228999895 +0000 UTC m=+3.845189024,LastTimestamp:2026-03-20 14:51:22.228999895 +0000 UTC m=+3.845189024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.655181 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a821db74f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.242553679 +0000 UTC m=+3.858742808,LastTimestamp:2026-03-20 14:51:22.242553679 +0000 UTC m=+3.858742808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc 
kubenswrapper[4764]: E0320 14:51:43.660488 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a82332155 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.243957077 +0000 UTC m=+3.860146216,LastTimestamp:2026-03-20 14:51:22.243957077 +0000 UTC m=+3.860146216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.668296 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a8cd6fd15 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.422467861 +0000 UTC 
m=+4.038656980,LastTimestamp:2026-03-20 14:51:22.422467861 +0000 UTC m=+4.038656980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.675953 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a8cd77b04 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.4225001 +0000 UTC m=+4.038689229,LastTimestamp:2026-03-20 14:51:22.4225001 +0000 UTC m=+4.038689229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.681561 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a8d6bbdc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
14:51:22.432216516 +0000 UTC m=+4.048405635,LastTimestamp:2026-03-20 14:51:22.432216516 +0000 UTC m=+4.048405635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.688563 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943a8da1f0a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.435768486 +0000 UTC m=+4.051957615,LastTimestamp:2026-03-20 14:51:22.435768486 +0000 UTC m=+4.051957615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.695535 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943abae32127 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.195015463 +0000 UTC m=+4.811204622,LastTimestamp:2026-03-20 14:51:23.195015463 +0000 UTC m=+4.811204622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.703530 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943ac9f3bf31 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.447762737 +0000 UTC m=+5.063951906,LastTimestamp:2026-03-20 14:51:23.447762737 +0000 UTC m=+5.063951906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.710156 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943acaab7fe7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
14:51:23.459805159 +0000 UTC m=+5.075994328,LastTimestamp:2026-03-20 14:51:23.459805159 +0000 UTC m=+5.075994328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.715203 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943acacbc28e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.461919374 +0000 UTC m=+5.078108543,LastTimestamp:2026-03-20 14:51:23.461919374 +0000 UTC m=+5.078108543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.720544 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943adbf2097d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.749640573 +0000 UTC m=+5.365829732,LastTimestamp:2026-03-20 14:51:23.749640573 +0000 UTC m=+5.365829732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.727275 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943add201394 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.769435028 +0000 UTC m=+5.385624197,LastTimestamp:2026-03-20 14:51:23.769435028 +0000 UTC m=+5.385624197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.731884 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943add395f4d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:23.771092813 +0000 UTC m=+5.387281972,LastTimestamp:2026-03-20 14:51:23.771092813 +0000 UTC m=+5.387281972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.739165 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943aee04c0bc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.05285702 +0000 UTC m=+5.669046179,LastTimestamp:2026-03-20 14:51:24.05285702 +0000 UTC m=+5.669046179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.745840 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943aef03d9f0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.069575152 +0000 UTC m=+5.685764321,LastTimestamp:2026-03-20 14:51:24.069575152 +0000 UTC m=+5.685764321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.751998 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943aef1cf03a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.071219258 +0000 UTC m=+5.687408417,LastTimestamp:2026-03-20 14:51:24.071219258 +0000 UTC m=+5.687408417,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.759453 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943affd1cdd0 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.35150792 +0000 UTC m=+5.967697079,LastTimestamp:2026-03-20 14:51:24.35150792 +0000 UTC m=+5.967697079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.765978 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943b0083d276 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.363174518 +0000 UTC m=+5.979363697,LastTimestamp:2026-03-20 14:51:24.363174518 +0000 UTC m=+5.979363697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.772690 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943b009cf2ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.364821194 +0000 UTC m=+5.981010373,LastTimestamp:2026-03-20 14:51:24.364821194 +0000 UTC m=+5.981010373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.778833 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943b0f451907 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.610722055 +0000 UTC m=+6.226911224,LastTimestamp:2026-03-20 14:51:24.610722055 +0000 UTC m=+6.226911224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.785718 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e943b10232d3f openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:24.625276223 +0000 UTC m=+6.241465392,LastTimestamp:2026-03-20 14:51:24.625276223 +0000 UTC m=+6.241465392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.796032 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 14:51:43 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189e943d0132810e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 14:51:43 crc kubenswrapper[4764]: body: Mar 20 14:51:43 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:32.96455707 +0000 UTC m=+14.580746239,LastTimestamp:2026-03-20 14:51:32.96455707 +0000 UTC m=+14.580746239,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:43 crc kubenswrapper[4764]: > Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.803104 4764 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943d0134391b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:32.964669723 +0000 UTC m=+14.580858892,LastTimestamp:2026-03-20 14:51:32.964669723 +0000 UTC m=+14.580858892,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.810669 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 14:51:43 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189e943d0c28e0e6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 14:51:43 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 14:51:43 crc kubenswrapper[4764]: Mar 20 14:51:43 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.148475622 +0000 UTC m=+14.764664761,LastTimestamp:2026-03-20 14:51:33.148475622 +0000 UTC m=+14.764664761,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:43 crc kubenswrapper[4764]: > Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.817762 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943d0c2999a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.148522913 +0000 UTC m=+14.764712052,LastTimestamp:2026-03-20 14:51:33.148522913 +0000 UTC m=+14.764712052,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.824959 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e943d0c28e0e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 14:51:43 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-apiserver-crc.189e943d0c28e0e6 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 14:51:43 crc kubenswrapper[4764]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 14:51:43 crc kubenswrapper[4764]: Mar 20 14:51:43 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.148475622 +0000 UTC m=+14.764664761,LastTimestamp:2026-03-20 14:51:33.157102262 +0000 UTC m=+14.773291401,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:43 crc kubenswrapper[4764]: > Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.831899 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e943d0c2999a1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943d0c2999a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.148522913 +0000 UTC m=+14.764712052,LastTimestamp:2026-03-20 14:51:33.157214065 +0000 UTC 
m=+14.773403194,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.838684 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 14:51:43 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189e943d0e668f8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 14:51:43 crc kubenswrapper[4764]: body: Mar 20 14:51:43 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.186072461 +0000 UTC m=+14.802261590,LastTimestamp:2026-03-20 14:51:33.186072461 +0000 UTC m=+14.802261590,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:43 crc kubenswrapper[4764]: > Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.843856 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943d0e678313 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.186134803 +0000 UTC m=+14.802323932,LastTimestamp:2026-03-20 14:51:33.186134803 +0000 UTC m=+14.802323932,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.852493 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e943a82332155\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e943a82332155 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:22.243957077 +0000 UTC m=+3.860146216,LastTimestamp:2026-03-20 14:51:34.257647075 +0000 UTC m=+15.873836214,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.862433 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e943d0e668f8d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 14:51:43 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189e943d0e668f8d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 14:51:43 crc kubenswrapper[4764]: body: Mar 20 14:51:43 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.186072461 +0000 UTC m=+14.802261590,LastTimestamp:2026-03-20 14:51:43.186710731 +0000 UTC m=+24.802899920,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:43 crc kubenswrapper[4764]: > Mar 20 14:51:43 crc kubenswrapper[4764]: E0320 14:51:43.868809 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e943d0e678313\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943d0e678313 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:33.186134803 +0000 UTC m=+14.802323932,LastTimestamp:2026-03-20 14:51:43.18701465 +0000 UTC m=+24.803203849,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:44 crc kubenswrapper[4764]: I0320 14:51:44.061328 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:44 crc kubenswrapper[4764]: W0320 14:51:44.368034 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 14:51:44 crc kubenswrapper[4764]: E0320 14:51:44.368133 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 14:51:45 crc kubenswrapper[4764]: I0320 14:51:45.059782 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.060620 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.551985 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.554194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.554281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.554306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:46 crc kubenswrapper[4764]: I0320 14:51:46.554353 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:46 crc kubenswrapper[4764]: E0320 14:51:46.564533 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:51:46 crc kubenswrapper[4764]: E0320 14:51:46.565034 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:51:47 crc kubenswrapper[4764]: I0320 14:51:47.059106 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:47 crc kubenswrapper[4764]: W0320 14:51:47.646716 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 14:51:47 crc kubenswrapper[4764]: E0320 14:51:47.646815 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 14:51:48 crc kubenswrapper[4764]: I0320 14:51:48.061153 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:49 crc kubenswrapper[4764]: I0320 14:51:49.059928 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:49 crc kubenswrapper[4764]: E0320 14:51:49.243210 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:51:50 crc kubenswrapper[4764]: I0320 14:51:50.057074 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.060023 4764 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.846605 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:58254->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.846724 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:58254->192.168.126.11:10357: read: connection reset by peer" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.846834 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.847094 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.849135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.849203 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.849224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.850058 4764 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 14:51:51 crc kubenswrapper[4764]: I0320 14:51:51.850452 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64" gracePeriod=30 Mar 20 14:51:51 crc kubenswrapper[4764]: E0320 14:51:51.858856 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 14:51:51 crc kubenswrapper[4764]: &Event{ObjectMeta:{kube-controller-manager-crc.189e944166a8ed21 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:58254->192.168.126.11:10357: read: connection reset by peer Mar 20 14:51:51 crc kubenswrapper[4764]: body: Mar 20 14:51:51 crc kubenswrapper[4764]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:51.846685985 +0000 UTC m=+33.462875154,LastTimestamp:2026-03-20 14:51:51.846685985 +0000 UTC m=+33.462875154,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 14:51:51 crc kubenswrapper[4764]: > Mar 20 14:51:51 crc kubenswrapper[4764]: E0320 14:51:51.865925 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e944166aa66ad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:58254->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:51.846782637 +0000 UTC m=+33.462971796,LastTimestamp:2026-03-20 14:51:51.846782637 +0000 UTC m=+33.462971796,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:51 crc kubenswrapper[4764]: E0320 14:51:51.873831 4764 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e944166e1e04c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:51.850418252 +0000 UTC m=+33.466607421,LastTimestamp:2026-03-20 14:51:51.850418252 +0000 UTC m=+33.466607421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:51 crc kubenswrapper[4764]: E0320 14:51:51.881752 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e943a1a52dbed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a1a52dbed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.501205997 +0000 UTC m=+2.117395166,LastTimestamp:2026-03-20 14:51:51.873978435 +0000 UTC m=+33.490167564,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.060928 4764 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:52 crc kubenswrapper[4764]: E0320 14:51:52.154482 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e943a311dcd0d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a311dcd0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.883604749 +0000 UTC m=+2.499793908,LastTimestamp:2026-03-20 14:51:52.146373452 +0000 UTC m=+33.762562611,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:52 crc kubenswrapper[4764]: E0320 14:51:52.169592 4764 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e943a3210f200\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e943a3210f200 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:51:20.899539456 +0000 UTC m=+2.515728625,LastTimestamp:2026-03-20 14:51:52.161520531 +0000 UTC m=+33.777709690,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.343864 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.351024 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64" exitCode=255 Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.351153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64"} Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.351207 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b"} Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.351477 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:52 
crc kubenswrapper[4764]: I0320 14:51:52.354062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.354139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:52 crc kubenswrapper[4764]: I0320 14:51:52.354161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.060214 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.565107 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.567351 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.567460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.567483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:53 crc kubenswrapper[4764]: I0320 14:51:53.567926 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:51:53 crc kubenswrapper[4764]: E0320 14:51:53.572501 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:51:53 crc kubenswrapper[4764]: E0320 14:51:53.572688 4764 controller.go:145] "Failed to ensure 
lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.060590 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.125948 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.128019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.128091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.128115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:54 crc kubenswrapper[4764]: I0320 14:51:54.129297 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e" Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.058757 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.361595 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.363567 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312"} Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.363781 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.364630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.364680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:55 crc kubenswrapper[4764]: I0320 14:51:55.364696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.059364 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.369595 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.370494 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.373490 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" exitCode=255 Mar 20 14:51:56 crc 
kubenswrapper[4764]: I0320 14:51:56.373551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312"} Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.373611 4764 scope.go:117] "RemoveContainer" containerID="6fb71b9e042f8875226ecce1b669d27f6e500792f9df7caa47bb55e15d67045e" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.373913 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.378965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.379041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.379063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:56 crc kubenswrapper[4764]: I0320 14:51:56.380231 4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:51:56 crc kubenswrapper[4764]: E0320 14:51:56.380621 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:51:57 crc kubenswrapper[4764]: I0320 14:51:57.060482 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:57 crc kubenswrapper[4764]: I0320 14:51:57.380915 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.058641 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.469517 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.469732 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.470849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.470880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.470890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:51:58 crc kubenswrapper[4764]: I0320 14:51:58.471445 4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:51:58 crc kubenswrapper[4764]: E0320 14:51:58.471609 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:51:58 crc kubenswrapper[4764]: W0320 14:51:58.590106 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 14:51:58 crc kubenswrapper[4764]: E0320 14:51:58.590208 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 14:51:59 crc kubenswrapper[4764]: I0320 14:51:59.062249 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:51:59 crc kubenswrapper[4764]: E0320 14:51:59.243868 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.059599 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.185000 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.185341 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:00 crc 
kubenswrapper[4764]: I0320 14:52:00.187451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.187533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.187553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.193659 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.394323 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.394523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.396123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.396209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.396231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.573146 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.574874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.574918 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.574934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:00 crc kubenswrapper[4764]: I0320 14:52:00.574969 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:52:00 crc kubenswrapper[4764]: E0320 14:52:00.584020 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:52:00 crc kubenswrapper[4764]: E0320 14:52:00.584184 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:52:01 crc kubenswrapper[4764]: I0320 14:52:01.065540 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:01 crc kubenswrapper[4764]: W0320 14:52:01.337877 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 14:52:01 crc kubenswrapper[4764]: E0320 14:52:01.338194 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster 
scope" logger="UnhandledError" Mar 20 14:52:01 crc kubenswrapper[4764]: I0320 14:52:01.396267 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:01 crc kubenswrapper[4764]: I0320 14:52:01.397197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:01 crc kubenswrapper[4764]: I0320 14:52:01.397311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:01 crc kubenswrapper[4764]: I0320 14:52:01.397379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:01 crc kubenswrapper[4764]: W0320 14:52:01.869204 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:01 crc kubenswrapper[4764]: E0320 14:52:01.869270 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.058583 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.964601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.964920 4764 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.966741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.966819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.966840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:02 crc kubenswrapper[4764]: I0320 14:52:02.967764 4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:52:02 crc kubenswrapper[4764]: E0320 14:52:02.968021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:03 crc kubenswrapper[4764]: I0320 14:52:03.061999 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:04 crc kubenswrapper[4764]: I0320 14:52:04.058132 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:05 crc kubenswrapper[4764]: I0320 14:52:05.054741 4764 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:06 crc kubenswrapper[4764]: I0320 14:52:06.060242 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.060186 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.584718 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.587838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.587913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.587937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:07 crc kubenswrapper[4764]: I0320 14:52:07.587985 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:52:07 crc kubenswrapper[4764]: E0320 14:52:07.595503 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:52:07 crc kubenswrapper[4764]: E0320 14:52:07.596559 4764 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:52:08 crc kubenswrapper[4764]: I0320 14:52:08.057985 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:08 crc kubenswrapper[4764]: W0320 14:52:08.711672 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 14:52:08 crc kubenswrapper[4764]: E0320 14:52:08.711756 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 14:52:09 crc kubenswrapper[4764]: I0320 14:52:09.059996 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:09 crc kubenswrapper[4764]: E0320 14:52:09.244666 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:52:10 crc kubenswrapper[4764]: I0320 14:52:10.059165 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.060746 4764 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.660864 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.661105 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.663003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.663073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:11 crc kubenswrapper[4764]: I0320 14:52:11.663099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:12 crc kubenswrapper[4764]: I0320 14:52:12.060065 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.060070 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.585569 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.585841 4764 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.587619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.587690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:13 crc kubenswrapper[4764]: I0320 14:52:13.587709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.058035 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.596452 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.602291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.603796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.603846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:14 crc kubenswrapper[4764]: I0320 14:52:14.603902 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:52:14 crc kubenswrapper[4764]: E0320 14:52:14.610455 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:52:14 crc kubenswrapper[4764]: E0320 14:52:14.610543 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.058622 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.125521 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.127269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.127360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.127445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:15 crc kubenswrapper[4764]: I0320 14:52:15.128330 4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:52:15 crc kubenswrapper[4764]: E0320 14:52:15.128692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:16 crc 
kubenswrapper[4764]: I0320 14:52:16.059941 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:17 crc kubenswrapper[4764]: I0320 14:52:17.059065 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:18 crc kubenswrapper[4764]: I0320 14:52:18.061194 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:19 crc kubenswrapper[4764]: I0320 14:52:19.059672 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:19 crc kubenswrapper[4764]: E0320 14:52:19.245693 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:52:20 crc kubenswrapper[4764]: I0320 14:52:20.056052 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.061215 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 
14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.611060 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.612343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.612375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.612406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:21 crc kubenswrapper[4764]: I0320 14:52:21.612435 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:52:21 crc kubenswrapper[4764]: E0320 14:52:21.619340 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 14:52:21 crc kubenswrapper[4764]: E0320 14:52:21.619341 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 14:52:22 crc kubenswrapper[4764]: I0320 14:52:22.064168 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 14:52:23 crc kubenswrapper[4764]: I0320 14:52:23.056800 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Mar 20 14:52:23 crc kubenswrapper[4764]: I0320 14:52:23.874785 4764 csr.go:261] certificate signing request csr-c6gb4 is approved, waiting to be issued Mar 20 14:52:23 crc kubenswrapper[4764]: I0320 14:52:23.882608 4764 csr.go:257] certificate signing request csr-c6gb4 is issued Mar 20 14:52:23 crc kubenswrapper[4764]: I0320 14:52:23.902786 4764 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 14:52:23 crc kubenswrapper[4764]: I0320 14:52:23.957585 4764 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 14:52:24 crc kubenswrapper[4764]: I0320 14:52:24.883854 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-21 05:32:47.561514764 +0000 UTC Mar 20 14:52:24 crc kubenswrapper[4764]: I0320 14:52:24.883922 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6614h40m22.677598248s for next certificate rotation Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.619749 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.622018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.622089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.622111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.622375 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.630065 4764 kubelet_node_status.go:115] "Node 
was previously registered" node="crc" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.630547 4764 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.630593 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.635024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.635093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.635112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.635142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.635164 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:28Z","lastTransitionTime":"2026-03-20T14:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.652928 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.664112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.664201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.664224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.664258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.664281 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:28Z","lastTransitionTime":"2026-03-20T14:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.679135 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.689502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.689568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.689591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.689617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.689634 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:28Z","lastTransitionTime":"2026-03-20T14:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.700908 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; byte-identical to the 14:52:28.679135 entry above] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.708947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.709005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.709019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.709042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:28 crc kubenswrapper[4764]: I0320 14:52:28.709055 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:28Z","lastTransitionTime":"2026-03-20T14:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.721153 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.721457 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.721513 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.822632 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:28 crc kubenswrapper[4764]: E0320 14:52:28.923119 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.024159 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.124938 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.125373 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.127491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.127555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.127577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.128659 
4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.225443 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.246873 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.325554 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.426690 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.516752 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.519376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666"} Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.519637 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.521372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.521432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:29 crc kubenswrapper[4764]: I0320 14:52:29.521454 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.527910 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.628037 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.729103 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.830058 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:29 crc kubenswrapper[4764]: E0320 14:52:29.930298 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.031429 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.131980 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.232897 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.333931 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.435002 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.523728 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.524312 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.525824 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" exitCode=255 Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.525861 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666"} Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.525897 4764 scope.go:117] "RemoveContainer" containerID="3e8a5a879f6f288f7d7d9bb0f1381f8768f97988dd145e1f192bbbd1a59ac312" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.526056 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.530893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.531005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.531049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:30 crc kubenswrapper[4764]: I0320 14:52:30.532296 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:52:30 
crc kubenswrapper[4764]: E0320 14:52:30.532746 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.535935 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.636901 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.737829 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.838675 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:30 crc kubenswrapper[4764]: E0320 14:52:30.939084 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.039480 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.140646 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.241069 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.341416 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc 
kubenswrapper[4764]: E0320 14:52:31.442056 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: I0320 14:52:31.532175 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.542464 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.643662 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.744861 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.845065 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:31 crc kubenswrapper[4764]: E0320 14:52:31.945714 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.046025 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.146497 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.247335 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.348537 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.448730 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.549263 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.649432 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.749944 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.851105 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.951526 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.964110 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.964405 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.966114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.966182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.966208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:32 crc kubenswrapper[4764]: I0320 14:52:32.967304 4764 scope.go:117] "RemoveContainer" 
containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:52:32 crc kubenswrapper[4764]: E0320 14:52:32.967730 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.052152 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.152722 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.253944 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.354154 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.455321 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.555736 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.656180 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.756364 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.857516 4764 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:33 crc kubenswrapper[4764]: E0320 14:52:33.957724 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.058563 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.158775 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.259974 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.360851 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.461474 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.561640 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.662287 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.762779 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.863196 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:34 crc kubenswrapper[4764]: E0320 14:52:34.963355 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc 
kubenswrapper[4764]: E0320 14:52:35.064491 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.165629 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.266563 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.366981 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.467498 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.568327 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.669514 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.770629 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.871207 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:35 crc kubenswrapper[4764]: E0320 14:52:35.971942 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.072643 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: I0320 14:52:36.125984 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 14:52:36 crc kubenswrapper[4764]: I0320 14:52:36.127836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:36 crc kubenswrapper[4764]: I0320 14:52:36.127908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:36 crc kubenswrapper[4764]: I0320 14:52:36.127928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.173328 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.274051 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.374584 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.475056 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.575479 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.675941 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.777087 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.878110 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:36 crc kubenswrapper[4764]: E0320 14:52:36.979198 4764 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.080796 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.181048 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.282259 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.383294 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.483999 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.584489 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.685147 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: E0320 14:52:37.787323 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.798803 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.891350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.891453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.891482 
4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.891517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.891540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:37Z","lastTransitionTime":"2026-03-20T14:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.979728 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.995353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.995650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.995824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.995983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:37 crc kubenswrapper[4764]: I0320 14:52:37.996112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:37Z","lastTransitionTime":"2026-03-20T14:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.085928 4764 apiserver.go:52] "Watching apiserver" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.097101 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.098374 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d4m5r","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-multus/network-metrics-daemon-fb2k7","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh","openshift-dns/node-resolver-8nwvm","openshift-multus/multus-additional-cni-plugins-279gq","openshift-ovn-kubernetes/ovnkube-node-p5lds","openshift-image-registry/node-ca-7kvvv","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-machine-config-operator/machine-config-daemon-6wln5","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.099355 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.099417 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.099516 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.100067 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.100132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.101100 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.101628 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.103539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.103820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.103986 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106802 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106847 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.106942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.107043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.107062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.107125 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.107181 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.108081 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.108227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.108681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.109522 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110195 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110410 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110547 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110610 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110624 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110707 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110738 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110757 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.110819 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.111829 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.112078 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.112256 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.112532 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.112834 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.112962 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.113013 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.119203 
4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120040 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120240 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120458 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120286 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120569 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120603 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120711 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.120746 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121007 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121146 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121204 4764 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121871 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121890 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.121883 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.122139 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.122327 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.122525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.122711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.123624 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.133786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.149117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.157871 4764 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.161923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164527 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164621 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164648 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164700 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164750 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 
14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164945 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.164991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165066 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165137 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165211 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165330 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165490 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165666 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165729 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165759 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165781 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165807 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165859 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165881 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165904 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.165975 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166049 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166079 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166114 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166180 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166325 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166443 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166490 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 
14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166535 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166683 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166708 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" 
(UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166898 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.166976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " 
Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167112 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167138 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167239 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167262 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 
14:52:38.167372 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167575 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167604 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167782 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.167983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168066 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168145 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.168186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168270 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168312 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168354 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168420 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168495 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168534 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168571 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168608 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168784 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169108 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.169222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169301 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169405 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169591 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.169682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169723 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169982 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170130 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170321 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170450 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170550 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170590 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170899 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.170987 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171455 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171530 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171908 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.171997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172079 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172277 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skbfv\" (UniqueName: \"kubernetes.io/projected/cf5cd911-963e-480f-8bc2-6be581e6d9e5-kube-api-access-skbfv\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-kubelet\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-multus-certs\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-system-cni-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpd8f\" (UniqueName: \"kubernetes.io/projected/84d09626-92c9-4c82-8dac-959885a658ca-kube-api-access-kpd8f\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172893 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-netns\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.172928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173016 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: 
I0320 14:52:38.173261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s727g\" (UniqueName: \"kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cf737ac-eb6b-499e-aa94-a37f8ced743b-host\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " 
pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173746 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-binary-copy\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173788 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q82f6\" (UniqueName: \"kubernetes.io/projected/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-kube-api-access-q82f6\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db133590-6855-4e67-92cd-353b342f66fe-hosts-file\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.173967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-socket-dir-parent\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-cnibin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174166 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2sx6\" (UniqueName: \"kubernetes.io/projected/db133590-6855-4e67-92cd-353b342f66fe-kube-api-access-m2sx6\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174251 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcnvw\" (UniqueName: 
\"kubernetes.io/projected/4c881e2f-a84e-4621-9e1e-f2197d698a63-kube-api-access-lcnvw\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-os-release\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174980 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf5cd911-963e-480f-8bc2-6be581e6d9e5-proxy-tls\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175346 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: 
I0320 14:52:38.175466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cf737ac-eb6b-499e-aa94-a37f8ced743b-serviceca\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-hostroot\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cnibin\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " 
pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175844 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-multus-daemon-config\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht7ds\" (UniqueName: \"kubernetes.io/projected/1f85a77d-475e-43c9-8181-093451bc058f-kube-api-access-ht7ds\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176124 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-bin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84d09626-92c9-4c82-8dac-959885a658ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-system-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176500 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-multus\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-os-release\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176572 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-cni-binary-copy\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf5cd911-963e-480f-8bc2-6be581e6d9e5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176742 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-k8s-cni-cncf-io\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-etc-kubernetes\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176849 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-conf-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf5cd911-963e-480f-8bc2-6be581e6d9e5-rootfs\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177046 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49t2\" (UniqueName: \"kubernetes.io/projected/7cf737ac-eb6b-499e-aa94-a37f8ced743b-kube-api-access-h49t2\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177172 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177201 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.168520 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.169157 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.174590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.175061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.176946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177063 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.177536 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.178557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.178974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.179064 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.179496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.179638 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.179790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.180009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.180083 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.180107 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.179896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181456 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181677 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181689 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181700 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.181995 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182114 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182636 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182733 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182760 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182791 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.182821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.183090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.183272 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.183349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.183912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.184078 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.184610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185129 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185153 4764 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185296 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185564 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185367 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.185964 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.187854 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.188073 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.188199 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.189640 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.189745 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.190248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.190359 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.689157531 +0000 UTC m=+80.305346700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.190347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.190729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.191357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.191620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.191755 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.191983 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.192064 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.192105 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.192152 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.194269 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.194289 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.194497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.194691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.194752 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.195106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.195157 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.196026 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.195846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.196691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.196861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.197100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.197610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.198181 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.202291 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.202478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.202984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.203053 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.203797 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.203934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.205185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.196709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.206017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.206343 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.206636 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.206762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.206805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.207492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.207855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.207869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.207934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.207936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.208662 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.205587 4764 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.210008 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.210144 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.210216 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.210335 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.210331 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.210258 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.211829 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.211999 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.212231 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.212411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.212754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.212928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.212901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.213474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.213495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.213942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.213955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.214460 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.214580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.214498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.214763 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.214841 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.214861 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.214898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.215308 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.215329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.215604 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.715580157 +0000 UTC m=+80.331769296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.215700 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.215740 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.216060 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.71601374 +0000 UTC m=+80.332202889 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.216369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.217563 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.717536426 +0000 UTC m=+80.333725575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222099 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.222660 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.223325 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.223784 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.224488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.224493 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.224595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.225226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.227225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.233149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.233239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.233932 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.233964 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.233984 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.234064 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.734038598 +0000 UTC m=+80.350227737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.239612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.245501 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.245821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.245822 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.246748 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.246744 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.247469 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.247863 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.248034 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.248267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.248351 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.248624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.248344 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.247889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.249035 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.249584 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.249650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.249946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.250291 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.250463 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.250894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.250982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251152 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251337 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251668 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.251931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.255428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.255457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.256012 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.262642 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.267053 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.268505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.269047 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.269131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.269087 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.270574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.270915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.271142 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.271202 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.271598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.271832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.271859 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272694 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272907 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272924 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.273080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.273115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.273408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.273431 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.273581 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.274160 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.274328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.274452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.274506 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.274856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.272266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.275185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.275553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.275690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.275910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.276648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.276754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.276828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.277443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2sx6\" (UniqueName: \"kubernetes.io/projected/db133590-6855-4e67-92cd-353b342f66fe-kube-api-access-m2sx6\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcnvw\" (UniqueName: \"kubernetes.io/projected/4c881e2f-a84e-4621-9e1e-f2197d698a63-kube-api-access-lcnvw\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278365 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-os-release\") pod \"multus-additional-cni-plugins-279gq\" (UID: 
\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278551 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cf737ac-eb6b-499e-aa94-a37f8ced743b-serviceca\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278584 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-hostroot\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.278617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf5cd911-963e-480f-8bc2-6be581e6d9e5-proxy-tls\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht7ds\" (UniqueName: \"kubernetes.io/projected/1f85a77d-475e-43c9-8181-093451bc058f-kube-api-access-ht7ds\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cnibin\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278794 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-multus-daemon-config\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-bin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-multus\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.278997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84d09626-92c9-4c82-8dac-959885a658ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-system-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-os-release\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-cni-binary-copy\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-k8s-cni-cncf-io\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-etc-kubernetes\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf5cd911-963e-480f-8bc2-6be581e6d9e5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49t2\" (UniqueName: \"kubernetes.io/projected/7cf737ac-eb6b-499e-aa94-a37f8ced743b-kube-api-access-h49t2\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-conf-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279789 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf5cd911-963e-480f-8bc2-6be581e6d9e5-rootfs\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-multus-certs\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skbfv\" (UniqueName: \"kubernetes.io/projected/cf5cd911-963e-480f-8bc2-6be581e6d9e5-kube-api-access-skbfv\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.279969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-kubelet\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-system-cni-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpd8f\" (UniqueName: \"kubernetes.io/projected/84d09626-92c9-4c82-8dac-959885a658ca-kube-api-access-kpd8f\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280140 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-netns\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280170 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s727g\" (UniqueName: \"kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cf737ac-eb6b-499e-aa94-a37f8ced743b-host\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db133590-6855-4e67-92cd-353b342f66fe-hosts-file\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-socket-dir-parent\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-binary-copy\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q82f6\" (UniqueName: \"kubernetes.io/projected/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-kube-api-access-q82f6\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280530 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-cnibin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280838 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.280856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.281395 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-tuning-conf-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.281794 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-socket-dir-parent\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") 
" pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-cnibin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.282943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.283002 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-k8s-cni-cncf-io\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.283167 4764 secret.go:188] Couldn't get 
secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.283209 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-kubelet\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.283237 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:38.783215727 +0000 UTC m=+80.399404866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.283339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-multus-certs\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.283112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cf5cd911-963e-480f-8bc2-6be581e6d9e5-rootfs\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.283689 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-conf-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.284868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-etc-kubernetes\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.286469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.286772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.286832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/84d09626-92c9-4c82-8dac-959885a658ca-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287163 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-system-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-os-release\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.284637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-hostroot\") pod \"multus-d4m5r\" (UID: 
\"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.287826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf5cd911-963e-480f-8bc2-6be581e6d9e5-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.288910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.289318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.289613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-run-netns\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.291607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-multus-daemon-config\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.291661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.291782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-multus-cni-dir\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.291814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-bin\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.291949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1f85a77d-475e-43c9-8181-093451bc058f-host-var-lib-cni-multus\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.292224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-system-cni-dir\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.292942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-os-release\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7cf737ac-eb6b-499e-aa94-a37f8ced743b-host\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.289347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cnibin\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293659 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/db133590-6855-4e67-92cd-353b342f66fe-hosts-file\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 
14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.293993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294289 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294311 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294323 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294335 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294345 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294356 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node 
\"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294368 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294391 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294404 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294419 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294432 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294443 4764 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294453 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294464 4764 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294475 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294485 4764 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294497 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294508 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294518 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294528 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294539 4764 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294569 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294579 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294589 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294602 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294613 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294625 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294636 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294646 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294656 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294667 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294678 4764 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294687 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294697 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294708 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: 
I0320 14:52:38.294718 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294729 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294740 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294751 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294762 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294772 4764 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294782 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294792 4764 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294803 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294813 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294823 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294833 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294846 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294859 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294870 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294881 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294891 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294901 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294912 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294922 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294933 4764 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294944 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294954 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294964 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294977 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294988 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.294999 4764 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295021 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295032 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295043 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295055 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295065 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295075 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295085 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295096 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on 
node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295106 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295118 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295129 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295139 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295149 4764 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295160 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295170 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295179 4764 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295189 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295198 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295209 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295220 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295232 4764 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295242 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295252 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295263 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295274 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295284 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295294 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295305 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295314 4764 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295324 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.295336 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295348 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295359 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295369 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295391 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295401 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295412 4764 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295421 4764 reconciler_common.go:293] "Volume detached for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.295431 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1f85a77d-475e-43c9-8181-093451bc058f-cni-binary-copy\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296284 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296302 4764 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296314 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296423 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296436 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296446 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296456 4764 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296466 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296582 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296592 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296601 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296610 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296619 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296628 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296745 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296756 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296766 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296775 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296785 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on 
node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.296795 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297213 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297222 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297230 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297242 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297355 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297645 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297662 4764 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297672 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297682 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297693 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297717 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297727 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297737 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297947 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297962 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.297975 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298644 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298805 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298869 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298929 4764 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.298987 4764 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299047 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299100 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299165 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299223 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299284 4764 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299338 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299461 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299527 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299585 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299641 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299701 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299762 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299818 4764 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299877 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299936 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.299990 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300042 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300140 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300206 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300266 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300730 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300869 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.300943 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301000 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301059 4764 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301196 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301233 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301248 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301260 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301271 4764 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301283 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49t2\" (UniqueName: \"kubernetes.io/projected/7cf737ac-eb6b-499e-aa94-a37f8ced743b-kube-api-access-h49t2\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301294 4764 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301422 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301442 4764 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301457 4764 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301471 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301486 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301505 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301917 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301956 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301971 4764 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301984 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301997 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.301281 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-cni-binary-copy\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.302011 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.302655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.302955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.303022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7cf737ac-eb6b-499e-aa94-a37f8ced743b-serviceca\") pod \"node-ca-7kvvv\" (UID: \"7cf737ac-eb6b-499e-aa94-a37f8ced743b\") " pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.304664 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht7ds\" (UniqueName: \"kubernetes.io/projected/1f85a77d-475e-43c9-8181-093451bc058f-kube-api-access-ht7ds\") pod \"multus-d4m5r\" (UID: \"1f85a77d-475e-43c9-8181-093451bc058f\") " pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.304775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.306253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84d09626-92c9-4c82-8dac-959885a658ca-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.306919 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf5cd911-963e-480f-8bc2-6be581e6d9e5-proxy-tls\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.311476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s727g\" (UniqueName: \"kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g\") pod \"ovnkube-node-p5lds\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.314451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skbfv\" (UniqueName: \"kubernetes.io/projected/cf5cd911-963e-480f-8bc2-6be581e6d9e5-kube-api-access-skbfv\") pod \"machine-config-daemon-6wln5\" (UID: \"cf5cd911-963e-480f-8bc2-6be581e6d9e5\") " pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.316746 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.317154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q82f6\" (UniqueName: \"kubernetes.io/projected/07de6dd3-cfb3-49f7-9ac3-6c3a522ff349-kube-api-access-q82f6\") pod \"multus-additional-cni-plugins-279gq\" (UID: \"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\") " pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.317431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcnvw\" (UniqueName: \"kubernetes.io/projected/4c881e2f-a84e-4621-9e1e-f2197d698a63-kube-api-access-lcnvw\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.317638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2sx6\" (UniqueName: \"kubernetes.io/projected/db133590-6855-4e67-92cd-353b342f66fe-kube-api-access-m2sx6\") pod \"node-resolver-8nwvm\" (UID: \"db133590-6855-4e67-92cd-353b342f66fe\") " pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.321722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpd8f\" (UniqueName: \"kubernetes.io/projected/84d09626-92c9-4c82-8dac-959885a658ca-kube-api-access-kpd8f\") pod \"ovnkube-control-plane-749d76644c-wczhh\" (UID: \"84d09626-92c9-4c82-8dac-959885a658ca\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.325959 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.326868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.330734 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.341520 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.353427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.362192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.372237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.403944 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.404020 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.404041 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.404067 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.426793 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.430273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.430337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.430357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.430412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.430431 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.434564 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.442543 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.446504 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c8d232ebf9ce194135467681ef70e4f9910406b66265eb6408764bf487369876 WatchSource:0}: Error finding container c8d232ebf9ce194135467681ef70e4f9910406b66265eb6408764bf487369876: Status 404 returned error can't find the container with id c8d232ebf9ce194135467681ef70e4f9910406b66265eb6408764bf487369876 Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.452536 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.455932 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6b360699b0501ca5344f1cfa26f92dd9373276cb9dc01021087f03c5313a854b WatchSource:0}: Error finding container 6b360699b0501ca5344f1cfa26f92dd9373276cb9dc01021087f03c5313a854b: Status 404 returned error can't find the container with id 6b360699b0501ca5344f1cfa26f92dd9373276cb9dc01021087f03c5313a854b Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.456832 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 20 14:52:38 crc 
kubenswrapper[4764]: else Mar 20 14:52:38 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 14:52:38 crc kubenswrapper[4764]: exit 1 Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.457977 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.464311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.467417 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 14:52:38 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 14:52:38 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 14:52:38 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 14:52:38 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ho_enable} \ Mar 20 14:52:38 crc kubenswrapper[4764]: --enable-interconnect \ Mar 20 14:52:38 crc kubenswrapper[4764]: --disable-approver \ Mar 20 14:52:38 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 20 14:52:38 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.467607 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5cd911_963e_480f_8bc2_6be581e6d9e5.slice/crio-be7af832d46e757d200a178c752f6d5e08288b99d1fa689bcbf9e567dfbecd7c WatchSource:0}: Error finding container 
be7af832d46e757d200a178c752f6d5e08288b99d1fa689bcbf9e567dfbecd7c: Status 404 returned error can't find the container with id be7af832d46e757d200a178c752f6d5e08288b99d1fa689bcbf9e567dfbecd7c Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.469516 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.476092 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skbfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.476272 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 14:52:38 
crc kubenswrapper[4764]: --disable-webhook \ Mar 20 14:52:38 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > 
logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.477581 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.479776 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skbfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.481265 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.486275 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-d4m5r" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.487626 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.487813 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.487874 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.493601 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:38 crc kubenswrapper[4764]: set -euo pipefail Mar 20 14:52:38 crc kubenswrapper[4764]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 14:52:38 crc kubenswrapper[4764]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 14:52:38 crc kubenswrapper[4764]: # As the secret mount is optional we must wait for the files to be present. Mar 20 14:52:38 crc kubenswrapper[4764]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 20 14:52:38 crc kubenswrapper[4764]: TS=$(date +%s) Mar 20 14:52:38 crc kubenswrapper[4764]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 14:52:38 crc kubenswrapper[4764]: HAS_LOGGED_INFO=0 Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: log_missing_certs(){ Mar 20 14:52:38 crc kubenswrapper[4764]: CUR_TS=$(date +%s) Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 14:52:38 crc kubenswrapper[4764]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 14:52:38 crc kubenswrapper[4764]: HAS_LOGGED_INFO=1 Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: } Mar 20 14:52:38 crc kubenswrapper[4764]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 14:52:38 crc kubenswrapper[4764]: log_missing_certs Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 5 Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/kube-rbac-proxy \ Mar 20 14:52:38 crc kubenswrapper[4764]: --logtostderr \ Mar 20 14:52:38 crc kubenswrapper[4764]: --secure-listen-address=:9108 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --upstream=http://127.0.0.1:29108/ \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-private-key-file=${TLS_PK} \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-cert-file=${TLS_CERT} Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpd8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-wczhh_openshift-ovn-kubernetes(84d09626-92c9-4c82-8dac-959885a658ca): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.496957 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_join_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 
14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_join_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_transit_switch_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_transit_switch_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: dns_name_resolver_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "false" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: persistent_ips_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "true" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # This is needed so that converting clusters from GA to TP Mar 20 14:52:38 crc kubenswrapper[4764]: # will rollout control plane pods as well Mar 20 14:52:38 crc kubenswrapper[4764]: network_segmentation_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: multi_network_enabled_flag= Mar 20 14:52:38 crc 
kubenswrapper[4764]: if [[ "true" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: multi_network_enabled_flag="--enable-multi-network" Mar 20 14:52:38 crc kubenswrapper[4764]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube \ Mar 20 14:52:38 crc kubenswrapper[4764]: --enable-interconnect \ Mar 20 14:52:38 crc kubenswrapper[4764]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --metrics-enable-pprof \ Mar 20 14:52:38 crc kubenswrapper[4764]: --metrics-enable-config-duration \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v4_join_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v6_join_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${dns_name_resolver_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${persistent_ips_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${multi_network_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${network_segmentation_enabled_flag} Mar 20 14:52:38 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpd8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-wczhh_openshift-ovn-kubernetes(84d09626-92c9-4c82-8dac-959885a658ca): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.497216 4764 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): 
CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.498455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.498535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" podUID="84d09626-92c9-4c82-8dac-959885a658ca" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.510098 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f85a77d_475e_43c9_8181_093451bc058f.slice/crio-87d5d905639d86d58ee71e2b80a3d5f313f5db069c9d558fe5c8d77c2f198e29 WatchSource:0}: Error finding container 87d5d905639d86d58ee71e2b80a3d5f313f5db069c9d558fe5c8d77c2f198e29: Status 404 returned error can't find the container with id 87d5d905639d86d58ee71e2b80a3d5f313f5db069c9d558fe5c8d77c2f198e29 Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.511161 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7kvvv" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.514248 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 14:52:38 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 14:52:38 crc kubenswrapper[4764]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ht7ds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d4m5r_openshift-multus(1f85a77d-475e-43c9-8181-093451bc058f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.515861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d4m5r" podUID="1f85a77d-475e-43c9-8181-093451bc058f" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.528478 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf737ac_eb6b_499e_aa94_a37f8ced743b.slice/crio-0eb506b6c62995ca4ee5bce8118b9f51537bffcb20422ddbe0398b01b4b9e1cb WatchSource:0}: Error finding container 0eb506b6c62995ca4ee5bce8118b9f51537bffcb20422ddbe0398b01b4b9e1cb: Status 404 returned error can't find the container with id 0eb506b6c62995ca4ee5bce8118b9f51537bffcb20422ddbe0398b01b4b9e1cb Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.532547 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 14:52:38 crc kubenswrapper[4764]: while [ true ]; Mar 20 14:52:38 crc kubenswrapper[4764]: do Mar 20 14:52:38 crc kubenswrapper[4764]: for f in $(ls /tmp/serviceca); do Mar 20 14:52:38 crc kubenswrapper[4764]: echo $f Mar 20 14:52:38 crc kubenswrapper[4764]: ca_file_path="/tmp/serviceca/${f}" Mar 20 14:52:38 crc kubenswrapper[4764]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 14:52:38 crc kubenswrapper[4764]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 14:52:38 crc kubenswrapper[4764]: if [ -e "${reg_dir_path}" ]; then Mar 20 14:52:38 crc kubenswrapper[4764]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 14:52:38 crc kubenswrapper[4764]: else Mar 20 14:52:38 crc kubenswrapper[4764]: mkdir $reg_dir_path Mar 20 14:52:38 crc kubenswrapper[4764]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: for d in $(ls /etc/docker/certs.d); do Mar 20 14:52:38 crc kubenswrapper[4764]: echo $d Mar 20 14:52:38 crc kubenswrapper[4764]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 14:52:38 crc kubenswrapper[4764]: 
reg_conf_path="/tmp/serviceca/${dp}" Mar 20 14:52:38 crc kubenswrapper[4764]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 14:52:38 crc kubenswrapper[4764]: rm -rf /etc/docker/certs.d/$d Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 60 & wait ${!} Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h49t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-7kvvv_openshift-image-registry(7cf737ac-eb6b-499e-aa94-a37f8ced743b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > 
logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.532645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.532675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.532711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.532730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.532741 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.533909 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-7kvvv" podUID="7cf737ac-eb6b-499e-aa94-a37f8ced743b" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.558314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b360699b0501ca5344f1cfa26f92dd9373276cb9dc01021087f03c5313a854b"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.560062 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 14:52:38 crc kubenswrapper[4764]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 14:52:38 crc kubenswrapper[4764]: ho_enable="--enable-hybrid-overlay" Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 14:52:38 crc kubenswrapper[4764]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 14:52:38 crc kubenswrapper[4764]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-host=127.0.0.1 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --webhook-port=9743 \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ho_enable} \ Mar 20 14:52:38 crc kubenswrapper[4764]: --enable-interconnect \ Mar 20 14:52:38 crc kubenswrapper[4764]: --disable-approver \ Mar 20 14:52:38 crc kubenswrapper[4764]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --wait-for-kubernetes-api=200s \ Mar 20 14:52:38 crc kubenswrapper[4764]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.560553 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c8d232ebf9ce194135467681ef70e4f9910406b66265eb6408764bf487369876"} Mar 20 14:52:38 crc 
kubenswrapper[4764]: I0320 14:52:38.562230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerStarted","Data":"87d5d905639d86d58ee71e2b80a3d5f313f5db069c9d558fe5c8d77c2f198e29"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.565942 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 14:52:38 crc kubenswrapper[4764]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 14:52:38 crc kubenswrapper[4764]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ht7ds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d4m5r_openshift-multus(1f85a77d-475e-43c9-8181-093451bc058f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.566245 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7kvvv" event={"ID":"7cf737ac-eb6b-499e-aa94-a37f8ced743b","Type":"ContainerStarted","Data":"0eb506b6c62995ca4ee5bce8118b9f51537bffcb20422ddbe0398b01b4b9e1cb"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.567805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: 
\"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d4m5r" podUID="1f85a77d-475e-43c9-8181-093451bc058f" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.568544 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-279gq" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.568740 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --disable-webhook \ Mar 20 14:52:38 crc kubenswrapper[4764]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel="${LOGLEVEL}" Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.569524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" event={"ID":"84d09626-92c9-4c82-8dac-959885a658ca","Type":"ContainerStarted","Data":"f6fe2aaa784c3da1e43b85df8fefd7e0c3bc83746c9d4bbe9b84b59d31b2c548"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.569796 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container 
&Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 14:52:38 crc kubenswrapper[4764]: while [ true ]; Mar 20 14:52:38 crc kubenswrapper[4764]: do Mar 20 14:52:38 crc kubenswrapper[4764]: for f in $(ls /tmp/serviceca); do Mar 20 14:52:38 crc kubenswrapper[4764]: echo $f Mar 20 14:52:38 crc kubenswrapper[4764]: ca_file_path="/tmp/serviceca/${f}" Mar 20 14:52:38 crc kubenswrapper[4764]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 14:52:38 crc kubenswrapper[4764]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 14:52:38 crc kubenswrapper[4764]: if [ -e "${reg_dir_path}" ]; then Mar 20 14:52:38 crc kubenswrapper[4764]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 14:52:38 crc kubenswrapper[4764]: else Mar 20 14:52:38 crc kubenswrapper[4764]: mkdir $reg_dir_path Mar 20 14:52:38 crc kubenswrapper[4764]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: for d in $(ls /etc/docker/certs.d); do Mar 20 14:52:38 crc kubenswrapper[4764]: echo $d Mar 20 14:52:38 crc kubenswrapper[4764]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 14:52:38 crc kubenswrapper[4764]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 14:52:38 crc kubenswrapper[4764]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 14:52:38 crc kubenswrapper[4764]: rm -rf /etc/docker/certs.d/$d Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 60 & wait ${!} Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h49t2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-7kvvv_openshift-image-registry(7cf737ac-eb6b-499e-aa94-a37f8ced743b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.569882 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.570965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-7kvvv" podUID="7cf737ac-eb6b-499e-aa94-a37f8ced743b" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.573133 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: source /etc/kubernetes/apiserver-url.env Mar 20 14:52:38 crc kubenswrapper[4764]: else Mar 20 14:52:38 crc kubenswrapper[4764]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 14:52:38 crc kubenswrapper[4764]: exit 1 Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 14:52:38 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.573295 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:38 crc kubenswrapper[4764]: set -euo pipefail Mar 20 14:52:38 crc kubenswrapper[4764]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 14:52:38 crc kubenswrapper[4764]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 14:52:38 crc 
kubenswrapper[4764]: # As the secret mount is optional we must wait for the files to be present. Mar 20 14:52:38 crc kubenswrapper[4764]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 14:52:38 crc kubenswrapper[4764]: TS=$(date +%s) Mar 20 14:52:38 crc kubenswrapper[4764]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 14:52:38 crc kubenswrapper[4764]: HAS_LOGGED_INFO=0 Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: log_missing_certs(){ Mar 20 14:52:38 crc kubenswrapper[4764]: CUR_TS=$(date +%s) Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 14:52:38 crc kubenswrapper[4764]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 14:52:38 crc kubenswrapper[4764]: HAS_LOGGED_INFO=1 Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: } Mar 20 14:52:38 crc kubenswrapper[4764]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 14:52:38 crc kubenswrapper[4764]: log_missing_certs Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 5 Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/kube-rbac-proxy \ Mar 20 14:52:38 crc kubenswrapper[4764]: --logtostderr \ Mar 20 14:52:38 crc kubenswrapper[4764]: --secure-listen-address=:9108 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 14:52:38 crc kubenswrapper[4764]: --upstream=http://127.0.0.1:29108/ \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-private-key-file=${TLS_PK} \ Mar 20 14:52:38 crc kubenswrapper[4764]: --tls-cert-file=${TLS_CERT} Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpd8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-wczhh_openshift-ovn-kubernetes(84d09626-92c9-4c82-8dac-959885a658ca): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.573441 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.574562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.576655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"db53269dd981921c20c107c93ed95277235f8efd562b95c93bf077221bfe9351"} Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.576984 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -f "/env/_master" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: set -o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: source "/env/_master" Mar 20 14:52:38 crc kubenswrapper[4764]: set +o allexport Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_join_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_join_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_transit_switch_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_transit_switch_subnet_opt= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "" != "" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 
20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: dns_name_resolver_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "false" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: persistent_ips_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "true" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # This is needed so that converting clusters from GA to TP Mar 20 14:52:38 crc kubenswrapper[4764]: # will rollout control plane pods as well Mar 20 14:52:38 crc kubenswrapper[4764]: network_segmentation_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: multi_network_enabled_flag= Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "true" == "true" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: multi_network_enabled_flag="--enable-multi-network" Mar 20 14:52:38 crc kubenswrapper[4764]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 14:52:38 crc kubenswrapper[4764]: exec /usr/bin/ovnkube \ Mar 20 14:52:38 crc kubenswrapper[4764]: --enable-interconnect \ Mar 20 14:52:38 crc kubenswrapper[4764]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 14:52:38 crc kubenswrapper[4764]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 14:52:38 crc kubenswrapper[4764]: 
--metrics-bind-address "127.0.0.1:29108" \ Mar 20 14:52:38 crc kubenswrapper[4764]: --metrics-enable-pprof \ Mar 20 14:52:38 crc kubenswrapper[4764]: --metrics-enable-config-duration \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v4_join_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v6_join_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${dns_name_resolver_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${persistent_ips_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${multi_network_enabled_flag} \ Mar 20 14:52:38 crc kubenswrapper[4764]: ${network_segmentation_enabled_flag} Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpd8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-wczhh_openshift-ovn-kubernetes(84d09626-92c9-4c82-8dac-959885a658ca): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.577990 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.578441 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8nwvm" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.578439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" podUID="84d09626-92c9-4c82-8dac-959885a658ca" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.578760 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"be7af832d46e757d200a178c752f6d5e08288b99d1fa689bcbf9e567dfbecd7c"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.579101 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.579302 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.579314 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" 
podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.580728 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skbfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.583532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.584100 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07de6dd3_cfb3_49f7_9ac3_6c3a522ff349.slice/crio-074d8c1914f83e542847e978521c0faecb93f9e51a41e58dab93367c974441b0 WatchSource:0}: Error finding container 074d8c1914f83e542847e978521c0faecb93f9e51a41e58dab93367c974441b0: Status 404 returned error can't find the container with id 
074d8c1914f83e542847e978521c0faecb93f9e51a41e58dab93367c974441b0 Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.587139 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-skbfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.588174 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q82f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-279gq_openshift-multus(07de6dd3-cfb3-49f7-9ac3-6c3a522ff349): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.589235 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.589515 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-279gq" podUID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.592365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.599408 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb133590_6855_4e67_92cd_353b342f66fe.slice/crio-de9374d81f419a0fabe08e127d331cda2f5127770e57dd067170c630e236d31f WatchSource:0}: Error finding container de9374d81f419a0fabe08e127d331cda2f5127770e57dd067170c630e236d31f: Status 404 returned error can't find the container with id de9374d81f419a0fabe08e127d331cda2f5127770e57dd067170c630e236d31f Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.601727 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.603558 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:38 crc kubenswrapper[4764]: set -uo pipefail Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 14:52:38 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 20 14:52:38 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. 
Mar 20 14:52:38 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 14:52:38 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 20 14:52:38 crc kubenswrapper[4764]: exit 1 Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: while true; do Mar 20 14:52:38 crc kubenswrapper[4764]: declare -A svc_ips Mar 20 14:52:38 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 20 14:52:38 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 14:52:38 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 14:52:38 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 14:52:38 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 14:52:38 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:38 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:38 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:38 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 14:52:38 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 20 14:52:38 crc kubenswrapper[4764]: do Mar 20 14:52:38 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 20 14:52:38 crc kubenswrapper[4764]: break Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 20 14:52:38 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 14:52:38 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 14:52:38 crc kubenswrapper[4764]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 14:52:38 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:38 crc kubenswrapper[4764]: continue Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # Append resolver entries for services Mar 20 14:52:38 crc kubenswrapper[4764]: rc=0 Mar 20 14:52:38 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 20 14:52:38 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 20 14:52:38 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:38 crc kubenswrapper[4764]: continue Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: Mar 20 14:52:38 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 14:52:38 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 20 14:52:38 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 14:52:38 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 14:52:38 crc kubenswrapper[4764]: fi Mar 20 14:52:38 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:38 crc kubenswrapper[4764]: unset svc_ips Mar 20 14:52:38 crc kubenswrapper[4764]: done Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2sx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-8nwvm_openshift-dns(db133590-6855-4e67-92cd-353b342f66fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.609221 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-8nwvm" podUID="db133590-6855-4e67-92cd-353b342f66fe" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.610533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: W0320 14:52:38.614562 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a6c163_0457_4626_9bbb_5628a5155673.slice/crio-429a65d617a759fd722f43c9c099401deaf966bf0ca3a152452cfd1c3a335920 WatchSource:0}: Error finding container 429a65d617a759fd722f43c9c099401deaf966bf0ca3a152452cfd1c3a335920: Status 404 returned error can't find the container with id 429a65d617a759fd722f43c9c099401deaf966bf0ca3a152452cfd1c3a335920 Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.617418 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:38 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 14:52:38 crc kubenswrapper[4764]: apiVersion: v1 Mar 20 14:52:38 crc kubenswrapper[4764]: clusters: Mar 20 14:52:38 crc kubenswrapper[4764]: - cluster: Mar 20 14:52:38 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 14:52:38 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443 Mar 20 14:52:38 crc kubenswrapper[4764]: name: default-cluster Mar 20 14:52:38 crc kubenswrapper[4764]: contexts: Mar 20 14:52:38 crc kubenswrapper[4764]: 
- context: Mar 20 14:52:38 crc kubenswrapper[4764]: cluster: default-cluster Mar 20 14:52:38 crc kubenswrapper[4764]: namespace: default Mar 20 14:52:38 crc kubenswrapper[4764]: user: default-auth Mar 20 14:52:38 crc kubenswrapper[4764]: name: default-context Mar 20 14:52:38 crc kubenswrapper[4764]: current-context: default-context Mar 20 14:52:38 crc kubenswrapper[4764]: kind: Config Mar 20 14:52:38 crc kubenswrapper[4764]: preferences: {} Mar 20 14:52:38 crc kubenswrapper[4764]: users: Mar 20 14:52:38 crc kubenswrapper[4764]: - name: default-auth Mar 20 14:52:38 crc kubenswrapper[4764]: user: Mar 20 14:52:38 crc kubenswrapper[4764]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 14:52:38 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 14:52:38 crc kubenswrapper[4764]: EOF Mar 20 14:52:38 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s727g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:38 
crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.619908 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.626691 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.634622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.634686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.634715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.634750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.634776 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.643128 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.657477 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.667717 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.676322 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.691762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.705906 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.709248 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.709323 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.709532 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.709492613 +0000 UTC m=+81.325681772 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.718529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.734315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.737998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.738045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.738085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.738109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.738144 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.750564 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.765468 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.780475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.793057 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.808220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.811015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811195 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.81115427 +0000 UTC m=+81.427343439 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.811325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.811439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.811553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811580 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 
14:52:38.811631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811716 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811739 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.811693737 +0000 UTC m=+81.427882896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811749 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811769 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811862 4764 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.811841451 +0000 UTC m=+81.428030620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811904 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811942 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811945 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811968 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.811993 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.811979626 +0000 UTC m=+81.428168795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:38 crc kubenswrapper[4764]: E0320 14:52:38.812044 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:39.812019917 +0000 UTC m=+81.428209126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.824839 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.841882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.841942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.841964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.842024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.842051 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.842048 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.857534 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.873543 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.886174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.906871 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.936584 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.945681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.945774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.945802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.945838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.945865 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:38Z","lastTransitionTime":"2026-03-20T14:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.957776 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.972823 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:38 crc kubenswrapper[4764]: I0320 14:52:38.987874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.006319 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.023104 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.048668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.048709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.048718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.048734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.048746 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.054994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.055021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.055029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.055042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.055051 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.069727 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.075265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.075357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.075409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.075446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.075465 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.091508 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.096853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.096905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.096923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.096949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.096967 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.113470 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.118799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.118854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.118874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.118898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.118915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.147003 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.158078 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159172 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.159929 4764 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.160752 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.162702 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.163714 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.165179 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.165996 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.166846 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.168233 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.169297 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.170668 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.171403 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.173056 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.173875 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.174663 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.176045 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.176866 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.178196 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.178925 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.179859 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.181504 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.182225 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.182639 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.183730 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.184518 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.186067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.186751 4764 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.187827 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.188989 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.189231 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.189896 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.190658 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.192071 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.192864 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.194553 4764 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.194729 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.195456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.195517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.195543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.195579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.195609 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.201771 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.203062 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.203534 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.204687 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.205884 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.206807 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.208296 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.209210 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.210505 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.211016 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.212102 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.212832 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.213880 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.214351 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.215327 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.215893 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.217045 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.217686 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.218581 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.219063 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.220054 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.220669 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.221135 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.222759 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.253017 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.264880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.277900 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.298557 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.299735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.299791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 
14:52:39.299807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.299831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.299845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.335988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.384363 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.403160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.403231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.403249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.403278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.403296 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.434678 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.466074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.500046 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.506671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.506715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.506732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.506755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.506773 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.539107 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.581266 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.585079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerStarted","Data":"074d8c1914f83e542847e978521c0faecb93f9e51a41e58dab93367c974441b0"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.587268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8nwvm" event={"ID":"db133590-6855-4e67-92cd-353b342f66fe","Type":"ContainerStarted","Data":"de9374d81f419a0fabe08e127d331cda2f5127770e57dd067170c630e236d31f"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.590155 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"429a65d617a759fd722f43c9c099401deaf966bf0ca3a152452cfd1c3a335920"} Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.590440 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q82f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-279gq_openshift-multus(07de6dd3-cfb3-49f7-9ac3-6c3a522ff349): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.590611 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:52:39 
crc kubenswrapper[4764]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 14:52:39 crc kubenswrapper[4764]: set -uo pipefail Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 14:52:39 crc kubenswrapper[4764]: HOSTS_FILE="/etc/hosts" Mar 20 14:52:39 crc kubenswrapper[4764]: TEMP_FILE="/etc/hosts.tmp" Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: # Make a temporary file with the old hosts file's attributes. Mar 20 14:52:39 crc kubenswrapper[4764]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 14:52:39 crc kubenswrapper[4764]: echo "Failed to preserve hosts file. Exiting." Mar 20 14:52:39 crc kubenswrapper[4764]: exit 1 Mar 20 14:52:39 crc kubenswrapper[4764]: fi Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: while true; do Mar 20 14:52:39 crc kubenswrapper[4764]: declare -A svc_ips Mar 20 14:52:39 crc kubenswrapper[4764]: for svc in "${services[@]}"; do Mar 20 14:52:39 crc kubenswrapper[4764]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 14:52:39 crc kubenswrapper[4764]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 14:52:39 crc kubenswrapper[4764]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 14:52:39 crc kubenswrapper[4764]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 20 14:52:39 crc kubenswrapper[4764]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:39 crc kubenswrapper[4764]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:39 crc kubenswrapper[4764]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 14:52:39 crc kubenswrapper[4764]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 14:52:39 crc kubenswrapper[4764]: for i in ${!cmds[*]} Mar 20 14:52:39 crc kubenswrapper[4764]: do Mar 20 14:52:39 crc kubenswrapper[4764]: ips=($(eval "${cmds[i]}")) Mar 20 14:52:39 crc kubenswrapper[4764]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 14:52:39 crc kubenswrapper[4764]: svc_ips["${svc}"]="${ips[@]}" Mar 20 14:52:39 crc kubenswrapper[4764]: break Mar 20 14:52:39 crc kubenswrapper[4764]: fi Mar 20 14:52:39 crc kubenswrapper[4764]: done Mar 20 14:52:39 crc kubenswrapper[4764]: done Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: # Update /etc/hosts only if we get valid service IPs Mar 20 14:52:39 crc kubenswrapper[4764]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 14:52:39 crc kubenswrapper[4764]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 14:52:39 crc kubenswrapper[4764]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 14:52:39 crc kubenswrapper[4764]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 14:52:39 crc kubenswrapper[4764]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 14:52:39 crc kubenswrapper[4764]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 14:52:39 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:39 crc kubenswrapper[4764]: continue Mar 20 14:52:39 crc kubenswrapper[4764]: fi Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: # Append resolver entries for services Mar 20 14:52:39 crc kubenswrapper[4764]: rc=0 Mar 20 14:52:39 crc kubenswrapper[4764]: for svc in "${!svc_ips[@]}"; do Mar 20 14:52:39 crc kubenswrapper[4764]: for ip in ${svc_ips[${svc}]}; do Mar 20 14:52:39 crc kubenswrapper[4764]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 14:52:39 crc kubenswrapper[4764]: done Mar 20 14:52:39 crc kubenswrapper[4764]: done Mar 20 14:52:39 crc kubenswrapper[4764]: if [[ $rc -ne 0 ]]; then Mar 20 14:52:39 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:39 crc kubenswrapper[4764]: continue Mar 20 14:52:39 crc kubenswrapper[4764]: fi Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: Mar 20 14:52:39 crc kubenswrapper[4764]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 14:52:39 crc kubenswrapper[4764]: # Replace /etc/hosts with our modified version if needed Mar 20 14:52:39 crc kubenswrapper[4764]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 14:52:39 crc kubenswrapper[4764]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 14:52:39 crc kubenswrapper[4764]: fi Mar 20 14:52:39 crc kubenswrapper[4764]: sleep 60 & wait Mar 20 14:52:39 crc kubenswrapper[4764]: unset svc_ips Mar 20 14:52:39 crc kubenswrapper[4764]: done Mar 20 14:52:39 crc kubenswrapper[4764]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2sx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-8nwvm_openshift-dns(db133590-6855-4e67-92cd-353b342f66fe): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 14:52:39 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.591742 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-8nwvm" 
podUID="db133590-6855-4e67-92cd-353b342f66fe"
Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.591761 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-279gq" podUID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349"
Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.592770 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 14:52:39 crc kubenswrapper[4764]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 20 14:52:39 crc kubenswrapper[4764]: apiVersion: v1
Mar 20 14:52:39 crc kubenswrapper[4764]: clusters:
Mar 20 14:52:39 crc kubenswrapper[4764]: - cluster:
Mar 20 14:52:39 crc kubenswrapper[4764]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 20 14:52:39 crc kubenswrapper[4764]: server: https://api-int.crc.testing:6443
Mar 20 14:52:39 crc kubenswrapper[4764]: name: default-cluster
Mar 20 14:52:39 crc kubenswrapper[4764]: contexts:
Mar 20 14:52:39 crc kubenswrapper[4764]: - context:
Mar 20 14:52:39 crc kubenswrapper[4764]: cluster: default-cluster
Mar 20 14:52:39 crc kubenswrapper[4764]: namespace: default
Mar 20 14:52:39 crc kubenswrapper[4764]: user: default-auth
Mar 20 14:52:39 crc kubenswrapper[4764]: name: default-context
Mar 20 14:52:39 crc kubenswrapper[4764]: current-context: default-context
Mar 20 14:52:39 crc kubenswrapper[4764]: kind: Config
Mar 20 14:52:39 crc kubenswrapper[4764]: preferences: {}
Mar 20 14:52:39 crc kubenswrapper[4764]: users:
Mar 20 14:52:39 crc kubenswrapper[4764]: - name: default-auth
Mar 20 14:52:39 crc kubenswrapper[4764]: user:
Mar 20 14:52:39 crc kubenswrapper[4764]: 
client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 14:52:39 crc kubenswrapper[4764]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 14:52:39 crc kubenswrapper[4764]: EOF
Mar 20 14:52:39 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s727g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 14:52:39 crc kubenswrapper[4764]: > logger="UnhandledError"
Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.594145 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673"
Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.609301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:52:39 crc 
kubenswrapper[4764]: I0320 14:52:39.609349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.609368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.609422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.609448 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.618684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.658214 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.697646 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.712635 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.712694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.712713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.712818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.712842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.722706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.722850 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.722940 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 14:52:41.722908976 +0000 UTC m=+83.339098145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.739877 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.776352 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.816914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.817027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.817047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.817078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.817097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.824617 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.824835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.824927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.825007 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.825062 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825101 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825196 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:41.825165242 +0000 UTC m=+83.441354401 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825307 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825343 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825367 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825467 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825497 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:41.825467901 +0000 UTC m=+83.441657060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825511 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825537 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825603 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825623 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:41.825593905 +0000 UTC m=+83.441783244 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.825731 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:41.825713859 +0000 UTC m=+83.441903018 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:39 crc kubenswrapper[4764]: E0320 14:52:39.826121 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:52:41.82607533 +0000 UTC m=+83.442264489 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.827517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.828948 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.891719 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.918060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have 
not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.920873 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.920958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.920988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.921035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.921055 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:39Z","lastTransitionTime":"2026-03-20T14:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:39 crc kubenswrapper[4764]: I0320 14:52:39.961981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.000679 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.024994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.025188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.025219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.025260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.025282 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.044027 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.082193 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.120123 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.125256 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.125294 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.125472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:40 crc kubenswrapper[4764]: E0320 14:52:40.125636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:40 crc kubenswrapper[4764]: E0320 14:52:40.125772 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:40 crc kubenswrapper[4764]: E0320 14:52:40.125908 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.126029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:40 crc kubenswrapper[4764]: E0320 14:52:40.126160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.129248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.129301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.129320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.129344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.129362 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.160047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.198292 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.235028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.235113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.235317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.235338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.235351 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.238110 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.277879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.323214 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.339180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.339256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.339275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.339302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.339322 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.357229 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.400997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.440715 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.442553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.442610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.442631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.442660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.442678 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.480760 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.522317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.546368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.546523 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.546554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.546596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.546626 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.561154 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.599469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.639000 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.651207 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.651258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.651280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.651311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.651333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.690724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.723130 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.755430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.755537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.755561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.755639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.755680 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.756769 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.805677 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.861468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.861949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.862166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.862360 4764 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.862570 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.965976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.966034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.966053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.966077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:40 crc kubenswrapper[4764]: I0320 14:52:40.966095 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:40Z","lastTransitionTime":"2026-03-20T14:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.070038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.070098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.070116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.070166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.070186 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.173477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.173538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.173554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.173576 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.173594 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.278057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.278787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.278811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.278840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.278863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.382899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.382980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.383006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.383037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.383062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.486298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.486346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.486367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.486415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.486433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.589650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.589721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.589743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.589778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.589796 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.693603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.693676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.693697 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.693716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.693762 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.751066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.751262 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.751351 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.751328471 +0000 UTC m=+87.367517710 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.797476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.797547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.797570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.797597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.797616 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.852228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.852458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852503 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.852465742 +0000 UTC m=+87.468654901 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.852624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.852683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852704 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852746 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852774 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.852794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852860 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852964 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.852888495 +0000 UTC m=+87.469077714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.852999 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853028 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853065 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853085 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.853061881 +0000 UTC m=+87.469251080 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853093 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853162 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.853128482 +0000 UTC m=+87.469317661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:41 crc kubenswrapper[4764]: E0320 14:52:41.853206 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:45.853186173 +0000 UTC m=+87.469375462 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.901155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.901226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.901250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.901280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:41 crc kubenswrapper[4764]: I0320 14:52:41.901306 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:41Z","lastTransitionTime":"2026-03-20T14:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.003697 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.003832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.003859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.003887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.003911 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.088350 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.109900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.109982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.109999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.110027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.110053 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.125502 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.125566 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.125505 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.125506 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:42 crc kubenswrapper[4764]: E0320 14:52:42.125675 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:42 crc kubenswrapper[4764]: E0320 14:52:42.125838 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:42 crc kubenswrapper[4764]: E0320 14:52:42.126057 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:42 crc kubenswrapper[4764]: E0320 14:52:42.126261 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.214019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.214089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.214111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.214140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.214163 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.317326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.317441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.317465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.317492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.317510 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.419964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.420073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.420091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.420120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.420140 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.522970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.523048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.523073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.523102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.523127 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.626277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.626350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.626368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.626431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.626453 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.730149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.730452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.730483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.730518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.730545 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.833237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.833300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.833314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.833336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.833351 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.936849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.936913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.936930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.936958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:42 crc kubenswrapper[4764]: I0320 14:52:42.936976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:42Z","lastTransitionTime":"2026-03-20T14:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.040769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.040892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.040911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.040938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.040956 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.143871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.143945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.143969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.144003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.144031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.248037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.248106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.248128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.248158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.248184 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.351854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.351988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.352014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.352045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.352062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.454689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.454729 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.454740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.454753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.454764 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.556778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.556807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.556816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.556829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.556838 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.659945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.660024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.660049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.660079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.660099 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.763637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.763701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.763717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.763745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.763763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.866889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.866967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.866985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.867010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.867028 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.969672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.969735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.969752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.969781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:43 crc kubenswrapper[4764]: I0320 14:52:43.969800 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:43Z","lastTransitionTime":"2026-03-20T14:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.073671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.073735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.073752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.073776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.073797 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.125759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.125832 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:44 crc kubenswrapper[4764]: E0320 14:52:44.125964 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.125989 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.126097 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:44 crc kubenswrapper[4764]: E0320 14:52:44.126234 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:44 crc kubenswrapper[4764]: E0320 14:52:44.126411 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:44 crc kubenswrapper[4764]: E0320 14:52:44.126611 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.177952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.178023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.178049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.178080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.178103 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.280881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.280931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.280944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.280969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.280985 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.384789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.384869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.384893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.384926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.384958 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.488667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.488731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.488749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.488773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.488792 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.591200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.591236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.591248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.591262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.591270 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.694251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.694305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.694324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.694349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.694371 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.797724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.797785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.797805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.797829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.797847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.900726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.900847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.900875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.900907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:44 crc kubenswrapper[4764]: I0320 14:52:44.900930 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:44Z","lastTransitionTime":"2026-03-20T14:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.004157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.004222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.004242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.004264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.004281 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.107498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.107594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.107613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.107637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.107653 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.211222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.211283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.211300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.211323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.211341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.314157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.314218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.314234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.314260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.314684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.421186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.421246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.421264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.421288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.421306 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.523979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.524037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.524055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.524080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.524098 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.626881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.626972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.626990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.627013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.627029 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.730474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.730528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.730537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.730551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.730559 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.800765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.801009 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.801179 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.801140327 +0000 UTC m=+95.417329496 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.834661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.834833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.834864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.834898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.834919 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.901831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902044 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.90200342 +0000 UTC m=+95.518192579 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.902124 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.902180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.902253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.902294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902307 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902414 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.902361791 +0000 UTC m=+95.518550960 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902551 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902633 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902662 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902555 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902772 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.902728742 +0000 UTC m=+95.518917961 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902562 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902944 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902969 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.902875 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.902826185 +0000 UTC m=+95.519015404 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:45 crc kubenswrapper[4764]: E0320 14:52:45.903058 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:52:53.903031621 +0000 UTC m=+95.519220790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.937967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.938026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.938044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.938068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:45 crc kubenswrapper[4764]: I0320 14:52:45.938087 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:45Z","lastTransitionTime":"2026-03-20T14:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.041535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.041628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.041651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.041678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.041695 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.125831 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.125892 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:46 crc kubenswrapper[4764]: E0320 14:52:46.125990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.126011 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.126060 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:46 crc kubenswrapper[4764]: E0320 14:52:46.126192 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:46 crc kubenswrapper[4764]: E0320 14:52:46.126322 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:46 crc kubenswrapper[4764]: E0320 14:52:46.126517 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.144834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.144897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.144933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.144961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.144980 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.248488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.248585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.248602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.248625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.248643 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.351429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.351537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.351558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.352095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.352139 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.455159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.455226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.455246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.455270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.455287 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.558228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.558276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.558287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.558311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.558325 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.661932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.661989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.662005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.662032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.662048 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.764850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.764944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.764963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.764988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.765006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.868925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.868996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.869020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.869049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.869071 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.971935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.972009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.972031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.972060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:46 crc kubenswrapper[4764]: I0320 14:52:46.972078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:46Z","lastTransitionTime":"2026-03-20T14:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.082773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.082897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.082914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.082941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.082960 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.187404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.187478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.187497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.187522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.187538 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.290629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.290699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.290717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.290744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.290763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.393712 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.393764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.393782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.393804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.393820 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.496688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.496765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.496783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.496809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.496850 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.599871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.599947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.599969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.600002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.600024 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.703438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.703516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.703536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.703562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.703581 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.806566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.806611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.806624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.806641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.806652 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.909866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.909938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.909963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.909991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:47 crc kubenswrapper[4764]: I0320 14:52:47.910009 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:47Z","lastTransitionTime":"2026-03-20T14:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.012273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.012315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.012332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.012360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.012378 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.115319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.115374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.115411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.115434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.115446 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.125740 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.125801 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.125897 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:48 crc kubenswrapper[4764]: E0320 14:52:48.125950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.126048 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:48 crc kubenswrapper[4764]: E0320 14:52:48.126207 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:48 crc kubenswrapper[4764]: E0320 14:52:48.126369 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:48 crc kubenswrapper[4764]: E0320 14:52:48.126572 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.218149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.218212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.218231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.218260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.218279 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.321202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.321303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.321321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.321371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.321425 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.430278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.430344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.430367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.430462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.430492 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.534463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.534538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.534560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.534591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.534613 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.637347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.637445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.637464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.637489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.637507 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.740816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.740894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.740915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.740950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.740976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.844581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.844655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.844681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.844718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.844737 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.947985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.948060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.948075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.948098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:48 crc kubenswrapper[4764]: I0320 14:52:48.948111 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:48Z","lastTransitionTime":"2026-03-20T14:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.051606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.051664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.051682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.051707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.051726 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.137995 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.153170 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.155643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.155722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.155743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.155770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.155788 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.168611 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.185476 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.203235 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.213555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.231512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.242877 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.256788 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.258120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.258166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.258177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.258199 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.258211 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.273083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.283572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.302043 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd19440
58fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.313970 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.334535 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.361333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.361400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.361414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.361435 4764 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.361638 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.366928 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.465776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.466352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.466363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.466402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.466416 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.470248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.470305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.470325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.470352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.470371 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.483125 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.487110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.487144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.487157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.487179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.487190 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.502110 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.511993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.512058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.512076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.512102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.512121 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.527472 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.531212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.531237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.531249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.531264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.531274 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.539490 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.543928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.543949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.543959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.543972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.543982 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.557443 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: E0320 14:52:49.557673 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.568325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.568366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.568397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.568415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.568427 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.627336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7kvvv" event={"ID":"7cf737ac-eb6b-499e-aa94-a37f8ced743b","Type":"ContainerStarted","Data":"750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.644922 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.658745 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.670649 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.670707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.670727 4764 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.670757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.670779 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.673585 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.688816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.699501 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.713092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.725774 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.736689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.754233 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd19440
58fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.766131 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.773514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.773571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.773591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.773621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.773640 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.785302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.812293 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.830001 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.845874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.862724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.876424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.876479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.876497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.876524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.876543 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.979647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.979732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.979760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.979795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:49 crc kubenswrapper[4764]: I0320 14:52:49.979821 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:49Z","lastTransitionTime":"2026-03-20T14:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.082884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.082945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.082962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.082986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.083003 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.125825 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:50 crc kubenswrapper[4764]: E0320 14:52:50.126102 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.126560 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.126714 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.126631 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:50 crc kubenswrapper[4764]: E0320 14:52:50.127131 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:50 crc kubenswrapper[4764]: E0320 14:52:50.127935 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:50 crc kubenswrapper[4764]: E0320 14:52:50.126988 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.186056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.186103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.186114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.186133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.186145 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.289322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.289364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.289374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.289410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.289423 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.392595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.392630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.392641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.392658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.392672 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.495820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.495875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.495892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.495924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.495949 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.599574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.599654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.599667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.599695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.599708 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.634144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.649652 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.666563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.681256 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702609 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702704 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.702947 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.721301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9
bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.742058 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.765344 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.779873 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.793345 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.806200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.806267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.806287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.806319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.806343 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.809231 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.827919 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.843429 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.857074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.871210 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.884546 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.910965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.911055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.911080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.911111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:50 crc kubenswrapper[4764]: I0320 14:52:50.911133 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:50Z","lastTransitionTime":"2026-03-20T14:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.014553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.014632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.014649 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.014675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.014695 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.117454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.117548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.117567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.117601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.117620 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.222478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.222544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.222564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.222592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.222609 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.327184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.327223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.327236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.327254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.327266 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.430257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.430323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.430347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.430374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.430426 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.533468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.533530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.533548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.533573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.533592 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.636221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.636308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.636327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.636353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.636376 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.640503 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.640577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.652710 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.663886 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9
bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.687634 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.718459 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.738741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.738802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.738821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.738847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.738865 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.741751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.756281 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.768937 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.784498 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.802891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.813537 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.833055 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.840920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.840972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.840993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.841018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.841035 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.847649 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.861332 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.877629 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.892740 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:51 crc 
kubenswrapper[4764]: I0320 14:52:51.943442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.943490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.943502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.943518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:51 crc kubenswrapper[4764]: I0320 14:52:51.943529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:51Z","lastTransitionTime":"2026-03-20T14:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.046713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.046785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.046803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.046829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.046847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.125825 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.125900 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.125935 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.125968 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:52 crc kubenswrapper[4764]: E0320 14:52:52.126120 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:52 crc kubenswrapper[4764]: E0320 14:52:52.126253 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:52 crc kubenswrapper[4764]: E0320 14:52:52.126648 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:52 crc kubenswrapper[4764]: E0320 14:52:52.126783 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.126861 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:52:52 crc kubenswrapper[4764]: E0320 14:52:52.127020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.149217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.149272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.149288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.149317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.149333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.251889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.251930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.251940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.251955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.251966 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.354761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.354802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.354823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.354837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.354862 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.457175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.457242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.457254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.457272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.457287 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.559479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.559546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.559568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.559598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.559620 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.662307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.662363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.662407 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.662433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.662451 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.765435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.765480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.765498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.765520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.765536 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.868717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.868804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.868828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.868861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.868880 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.972331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.972414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.972433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.972460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:52 crc kubenswrapper[4764]: I0320 14:52:52.972477 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:52Z","lastTransitionTime":"2026-03-20T14:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.075611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.075671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.075690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.075716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.075733 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.178180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.178224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.178240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.178263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.178280 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.283167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.283228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.283252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.283282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.283310 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.386308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.386364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.386406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.386432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.386452 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.489027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.489074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.489088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.489107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.489121 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.591809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.591872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.591889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.591919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.591938 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.648930 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" exitCode=0 Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.649049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.651866 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" event={"ID":"84d09626-92c9-4c82-8dac-959885a658ca","Type":"ContainerStarted","Data":"f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.651921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" event={"ID":"84d09626-92c9-4c82-8dac-959885a658ca","Type":"ContainerStarted","Data":"f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.654060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.667787 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.682355 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc 
kubenswrapper[4764]: I0320 14:52:53.694215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.694300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.694329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.694361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.694439 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.700572 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.717814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.745401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.781587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.797162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.797187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.797196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.797210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.797220 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.807074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.832024 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.856981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.876262 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.896638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.897485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.897837 4764 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.898021 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.897987815 +0000 UTC m=+111.514176984 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.902034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.902092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.902109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.902133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.902150 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:53Z","lastTransitionTime":"2026-03-20T14:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.916299 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.940576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.954610 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.967074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.980113 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.991747 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:53Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.998025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.998150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.998198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.998241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:53 crc kubenswrapper[4764]: I0320 14:52:53.998268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998374 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.998316111 +0000 UTC m=+111.614505240 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998406 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998455 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998479 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998504 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998531 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.998507897 +0000 UTC m=+111.614697226 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998546 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998473 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998575 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998581 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998591 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.998557529 +0000 UTC m=+111.614746828 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998642 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.998625681 +0000 UTC m=+111.614814820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:53 crc kubenswrapper[4764]: E0320 14:52:53.998664 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:09.998656242 +0000 UTC m=+111.614845381 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.002167 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.004304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.004363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.004408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.004428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.004438 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.018169 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.028674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.041137 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.053445 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.067497 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc 
kubenswrapper[4764]: I0320 14:52:54.085612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.098566 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.107242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.107284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.107297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.107316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.107329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.116913 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.139670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.151520 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.162819 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.175982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.195559 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.195565 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.195701 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.195819 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:54 crc kubenswrapper[4764]: E0320 14:52:54.196475 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:54 crc kubenswrapper[4764]: E0320 14:52:54.196530 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:54 crc kubenswrapper[4764]: E0320 14:52:54.196621 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:54 crc kubenswrapper[4764]: E0320 14:52:54.197155 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.210758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.210788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.210798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.210811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.210822 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.318618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.319448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.319471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.319544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.319564 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.435487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.435534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.435551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.435579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.435599 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.540868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.540913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.540925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.540944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.540957 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.645872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.645919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.645932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.645951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.645965 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.664056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.664115 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.670354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.670479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.670502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.670521 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 
14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.670539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.672600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerStarted","Data":"eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.674311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8nwvm" event={"ID":"db133590-6855-4e67-92cd-353b342f66fe","Type":"ContainerStarted","Data":"e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.687852 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.702439 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.722701 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.744339 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.748716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.748770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.748780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.748800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.748811 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.767899 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.785558 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.799773 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc 
kubenswrapper[4764]: I0320 14:52:54.817617 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.837837 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.851895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.851953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.851966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.851983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.851996 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.854464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.868124 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.879203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.895903 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.914991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.933071 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc 
kubenswrapper[4764]: I0320 14:52:54.952346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.953834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.953870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.953883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.953901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.953914 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:54Z","lastTransitionTime":"2026-03-20T14:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.969795 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.984346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:54 crc kubenswrapper[4764]: I0320 14:52:54.999206 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:54Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.020593 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.040162 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.056635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.056684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.056702 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.056728 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.056745 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.061941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.076438 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.095469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.108031 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.124224 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc 
kubenswrapper[4764]: I0320 14:52:55.143994 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.159715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.159760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.159774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.159791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.159806 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.166465 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.189848 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.237901 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:55Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.270202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.270251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.270264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.270285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.270298 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.375279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.375343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.375360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.375413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.375431 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.480901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.481335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.481347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.481371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.481403 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.584455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.584530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.584551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.584581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.584602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.682564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.686859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.686913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.686926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.686945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.686959 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.789971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.790034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.790053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.790078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.790117 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.893697 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.893969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.894059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.894146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.894223 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.997674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.998095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.998342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.998543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:55 crc kubenswrapper[4764]: I0320 14:52:55.998599 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:55Z","lastTransitionTime":"2026-03-20T14:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.102201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.102289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.102317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.102356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.102410 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.125979 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.126077 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:56 crc kubenswrapper[4764]: E0320 14:52:56.126605 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.126257 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.126137 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:56 crc kubenswrapper[4764]: E0320 14:52:56.126781 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:56 crc kubenswrapper[4764]: E0320 14:52:56.126969 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:56 crc kubenswrapper[4764]: E0320 14:52:56.127140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.204980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.205056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.205073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.205105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.205120 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.307617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.307680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.307698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.307724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.307742 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.411760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.411882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.411939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.412017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.412076 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.515022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.515086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.515105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.515130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.515148 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.618114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.618192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.618211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.618243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.618261 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.688982 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062" exitCode=0 Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.689070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.713537 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a4
5dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.733514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.733564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.733575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.733594 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.733604 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.746876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\
\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.820444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.838132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.838195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.838213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.838237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.838256 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.847338 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.867223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.887586 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.907941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.930590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.945306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.945376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.945419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.945445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.945465 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:56Z","lastTransitionTime":"2026-03-20T14:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.956342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.971751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:56 crc kubenswrapper[4764]: I0320 14:52:56.994320 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:56Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.015551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.032628 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc 
kubenswrapper[4764]: I0320 14:52:57.047553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.047630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.047652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.047677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.047693 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.049783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.063464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.151218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.151349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.151507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.151550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.151576 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.254296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.254364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.254416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.254441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.254459 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.358759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.358825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.358848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.358875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.358893 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.462249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.462315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.462337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.462372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.462425 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.569628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.569680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.569698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.569722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.569740 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.672874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.672927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.672938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.672956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.672967 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.701105 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.704046 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3" exitCode=0 Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.704132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.761516 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.779708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.779767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.779780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc 
kubenswrapper[4764]: I0320 14:52:57.779807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.779823 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.801293 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20
T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.819324 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-m
ultus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.834085 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-
resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.847637 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.867606 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.881251 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.886375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.886448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.886462 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.886484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.886496 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.899769 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.913630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.924660 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.936465 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc 
kubenswrapper[4764]: I0320 14:52:57.957267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.972634 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.986157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.989717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.989748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.989759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.989773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
14:52:57 crc kubenswrapper[4764]: I0320 14:52:57.989783 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:57Z","lastTransitionTime":"2026-03-20T14:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.001724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:57Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.093012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.093084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.093104 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.093132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.093154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.127821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.127883 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.127976 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.127993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:52:58 crc kubenswrapper[4764]: E0320 14:52:58.128112 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:52:58 crc kubenswrapper[4764]: E0320 14:52:58.128287 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:52:58 crc kubenswrapper[4764]: E0320 14:52:58.128453 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:52:58 crc kubenswrapper[4764]: E0320 14:52:58.128556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.196626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.196711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.196737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.196773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.196801 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.299890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.299986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.300005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.300031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.300049 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.402976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.403045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.403070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.403096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.403113 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.506102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.506169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.506190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.506217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.506237 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.610149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.610235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.610257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.610283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.610302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.712619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.712683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.712700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.712724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.712744 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.713332 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8" exitCode=0 Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.713422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.756117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.781368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.816862 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.818203 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.818243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.818255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.818272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.818284 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.842644 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 
14:52:58.856091 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.869706 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.886745 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.897832 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.921063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.921098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.921110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.921127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.921142 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:58Z","lastTransitionTime":"2026-03-20T14:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.924988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.943237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.960186 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.979933 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:58 crc kubenswrapper[4764]: I0320 14:52:58.998038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:58Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.014199 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.023303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.023328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.023336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.023350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.023359 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.026261 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc 
kubenswrapper[4764]: I0320 14:52:59.126101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.126197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.126217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.126250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.126272 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.143621 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.159157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.179542 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.197369 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.214278 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.230042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.230087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.230103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.230128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.230144 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.233315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.250049 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.264196 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc 
kubenswrapper[4764]: I0320 14:52:59.282637 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.295797 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.314861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 
14:52:59.334567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.334625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.334646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.334671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.334691 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.340901 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.353482 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.372233 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.384211 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12
962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.437913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.437964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.438102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.438124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.438136 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.541030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.541069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.541077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.541093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.541102 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.644041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.644121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.644137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.644160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.644175 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.721786 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a" exitCode=0 Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.721885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.736582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.736664 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.736683 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.736695 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.747955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.748401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.748445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.748457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.748477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.748490 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.773244 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.786927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.786981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.786990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.787004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.787014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.799965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.801681 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.803196 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:52:59 crc kubenswrapper[4764]: E0320 14:52:59.812911 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.829174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.829228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.829248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.829275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.829294 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.831229 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.852425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: E0320 14:52:59.854452 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.860823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.860894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.860940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.860979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.861002 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.877340 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: E0320 14:52:59.881871 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.888083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.888150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.888167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.888196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.888215 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.894587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.907565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc 
kubenswrapper[4764]: E0320 14:52:59.908370 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.912980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.913036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.913056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.913079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.913096 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.926014 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: E0320 14:52:59.931245 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: E0320 14:52:59.931613 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.933912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.934046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.934124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.934223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.934318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:52:59Z","lastTransitionTime":"2026-03-20T14:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.937460 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.960458 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa99
6251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.985511 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:5
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:52:59 crc kubenswrapper[4764]: I0320 14:52:59.998935 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:52:59Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.012809 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.033739 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.036772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc 
kubenswrapper[4764]: I0320 14:53:00.036815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.036829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.036849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.036863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.053644 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f26
38d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.066513 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.083449 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.105899 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.121836 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.125416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.125463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.125631 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:00 crc kubenswrapper[4764]: E0320 14:53:00.125635 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:00 crc kubenswrapper[4764]: E0320 14:53:00.126032 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:00 crc kubenswrapper[4764]: E0320 14:53:00.126236 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.131067 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:00 crc kubenswrapper[4764]: E0320 14:53:00.131460 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.139570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.139655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.139678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.139714 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.139735 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.148746 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.170207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.190825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.213187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.232869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.241882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.241927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.241974 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.241999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.242014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.249040 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.266428 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.285651 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.300507 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.315513 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc 
kubenswrapper[4764]: I0320 14:53:00.345275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.345335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.345348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.345370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.345415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.447954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.448058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.448085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.448117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.448139 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.551467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.551543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.551570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.551602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.551623 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.654478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.654549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.654604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.654645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.654668 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.745515 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55" exitCode=0 Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.745674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.757814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.757880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.757898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.757924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.757942 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.766425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.791724 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.811981 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.832030 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.844729 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.861027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.861055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.861064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.861081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.861091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.864194 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.883001 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.899794 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.917188 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.936226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.951329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:00 crc 
kubenswrapper[4764]: I0320 14:53:00.965803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.965877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.965897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.966342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.966429 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:00Z","lastTransitionTime":"2026-03-20T14:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:00 crc kubenswrapper[4764]: I0320 14:53:00.979174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:00Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.011441 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.038060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.053322 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.069972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.070026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.070040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.070071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.070089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.173162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.173202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.173213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.173230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.173243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.276177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.276240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.276259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.276282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.276297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.378974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.379029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.379050 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.379073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.379090 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.481567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.481613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.481623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.481640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.481651 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.585171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.585220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.585232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.585249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.585261 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.688135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.688198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.688215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.688241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.688259 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.751370 4764 generic.go:334] "Generic (PLEG): container finished" podID="07de6dd3-cfb3-49f7-9ac3-6c3a522ff349" containerID="02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4" exitCode=0 Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.751433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerDied","Data":"02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.784623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.796309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.796342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.796353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.796369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.796399 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.799647 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.819426 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.833188 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.856308 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.876772 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.898883 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.904106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.904182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.904203 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.904227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.904315 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:01Z","lastTransitionTime":"2026-03-20T14:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.941467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.964800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.981108 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:01 crc kubenswrapper[4764]: I0320 14:53:01.998455 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:01Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc 
kubenswrapper[4764]: I0320 14:53:02.015257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.015452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.015561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.015692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.015805 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.035298 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.057516 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.073743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.088810 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.119141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.119447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.119602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.119750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.119907 4764 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.125605 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.125757 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.125630 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:02 crc kubenswrapper[4764]: E0320 14:53:02.125939 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.126028 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:02 crc kubenswrapper[4764]: E0320 14:53:02.126176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:02 crc kubenswrapper[4764]: E0320 14:53:02.126313 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:02 crc kubenswrapper[4764]: E0320 14:53:02.126752 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.223189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.223233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.223244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.223265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.223277 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.326819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.327186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.327205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.327282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.327303 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.430334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.430413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.430431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.430457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.430474 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.533968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.535822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.536040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.536251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.536504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.640710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.640793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.640821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.640857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.640878 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.744205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.744284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.744307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.744338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.744359 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.758728 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/0.log" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.763579 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc" exitCode=1 Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.763678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.764763 4764 scope.go:117] "RemoveContainer" containerID="9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.771132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" event={"ID":"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349","Type":"ContainerStarted","Data":"e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.785372 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.805958 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84
c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.825612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.849914 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.870577 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.890823 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.908782 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.933049 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.953567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.953631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.953649 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.953674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.953691 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:02Z","lastTransitionTime":"2026-03-20T14:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.954265 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc kubenswrapper[4764]: I0320 14:53:02.971624 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:02 crc 
kubenswrapper[4764]: I0320 14:53:02.991225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:02Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.007602 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.035783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.056289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.056345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.056363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.056415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.056433 4764 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.060127 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:02.012082 6592 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 14:53:02.012138 6592 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0320 14:53:02.012149 6592 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 14:53:02.012233 6592 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:02.012294 6592 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:02.012312 6592 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:02.012348 6592 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 14:53:02.012478 6592 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:02.012487 6592 factory.go:656] Stopping watch factory\\\\nI0320 14:53:02.012521 6592 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 14:53:02.012529 6592 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:02.012550 6592 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:02.012574 6592 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.080521 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.097343 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.118418 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.150485 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.159266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.159364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.159413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.159439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.159455 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.175741 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.198769 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.221901 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.246081 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.262234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.262284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.262303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.262327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.262346 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.264759 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.288959 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.305933 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.320723 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc 
kubenswrapper[4764]: I0320 14:53:03.354056 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.364925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.364992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.365007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.365030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.365049 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.376349 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.394299 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020
c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.418709 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:02.012082 6592 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 14:53:02.012138 6592 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0320 14:53:02.012149 6592 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 14:53:02.012233 6592 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:02.012294 6592 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:02.012312 6592 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:02.012348 6592 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 14:53:02.012478 6592 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:02.012487 6592 factory.go:656] Stopping watch factory\\\\nI0320 14:53:02.012521 6592 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 14:53:02.012529 6592 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:02.012550 6592 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:02.012574 6592 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.467704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.467754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.467770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.467795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.467810 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.570680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.570735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.570755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.570779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.570795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.674425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.674503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.674528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.674566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.674591 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.779533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.779613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.779654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.779690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.779715 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.786860 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/0.log" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.791367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.792086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.816239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.836737 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.853816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc 
kubenswrapper[4764]: I0320 14:53:03.876301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.882829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.882899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.882919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.882947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.882968 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.895592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.920614 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020
c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.946912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:02.012082 6592 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 14:53:02.012138 6592 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 14:53:02.012149 6592 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0320 14:53:02.012233 6592 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:02.012294 6592 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:02.012312 6592 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:02.012348 6592 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 14:53:02.012478 6592 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:02.012487 6592 factory.go:656] Stopping watch factory\\\\nI0320 14:53:02.012521 6592 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 14:53:02.012529 6592 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:02.012550 6592 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:02.012574 6592 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.964152 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.984406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:03Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.986089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.986140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.986160 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.986228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:03 crc kubenswrapper[4764]: I0320 14:53:03.986249 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:03Z","lastTransitionTime":"2026-03-20T14:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.006254 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.027752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17
de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.045271 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.066619 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.089616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.089680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.089698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.089723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.089741 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.090841 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.108623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.126195 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.126229 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.126231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.126336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.126442 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.126719 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.127154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.127292 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.127448 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.127694 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.192663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.192721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.192745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.192774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.192797 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.297312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.297422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.297456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.297494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.297521 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.402777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.402855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.402873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.402903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.402926 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.505703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.505774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.505791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.505816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.505834 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.609455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.609510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.609526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.609549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.609565 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.712680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.712734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.712752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.712773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.712789 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.798000 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/1.log" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.799312 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/0.log" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.803078 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b" exitCode=1 Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.803148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.803222 4764 scope.go:117] "RemoveContainer" containerID="9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.804562 4764 scope.go:117] "RemoveContainer" containerID="396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b" Mar 20 14:53:04 crc kubenswrapper[4764]: E0320 14:53:04.804869 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.818339 4764 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.818416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.818433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.818455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.818471 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.834526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.855239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc 
kubenswrapper[4764]: I0320 14:53:04.876221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.891875 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.915344 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.926490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.926547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.926563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.926583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.926622 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:04Z","lastTransitionTime":"2026-03-20T14:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.945482 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a1e3b348b5ecf870ddcbe4ab27521df056302085e8d76971d9bc4bb3a6d50dc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:02.012082 6592 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 14:53:02.012138 6592 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 14:53:02.012149 6592 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0320 14:53:02.012233 6592 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:02.012294 6592 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:02.012312 6592 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:02.012348 6592 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 14:53:02.012478 6592 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:02.012487 6592 factory.go:656] Stopping watch factory\\\\nI0320 14:53:02.012521 6592 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 14:53:02.012529 6592 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:02.012542 6592 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:02.012550 6592 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:02.012574 6592 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping 
watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/n
etworks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.966213 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.984848 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:04 crc kubenswrapper[4764]: I0320 14:53:04.999346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84
c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:04Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.015550 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.029900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.029970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.029989 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.030014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.030031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.036466 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.056525 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.078297 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.093539 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.114371 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.132752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.132984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.133098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.133133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.133157 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.152666 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.236361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.236424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.236435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.236455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.236467 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.340453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.340505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.340520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.340544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.340561 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.442771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.442812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.442831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.442854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.442872 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.545680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.545949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.546060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.546170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.546263 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.650209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.650296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.650323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.650356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.650417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.753793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.754077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.754201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.754329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.754516 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.809530 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/1.log" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.815564 4764 scope.go:117] "RemoveContainer" containerID="396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b" Mar 20 14:53:05 crc kubenswrapper[4764]: E0320 14:53:05.816816 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.832254 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.854781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.860227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.860274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.860291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.860314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.860331 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.908907 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.947770 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f26
38d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.960264 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.962936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.963005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.963021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:05 crc 
kubenswrapper[4764]: I0320 14:53:05.963039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.963052 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:05Z","lastTransitionTime":"2026-03-20T14:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.974039 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20
T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:05 crc kubenswrapper[4764]: I0320 14:53:05.995028 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-m
ultus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:05Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.012433 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.034393 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.048853 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.065883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.065922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.065933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.065948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.065974 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.068085 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.085238 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.098021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.125796 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.125897 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.125921 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:06 crc kubenswrapper[4764]: E0320 14:53:06.126020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.126087 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:06 crc kubenswrapper[4764]: E0320 14:53:06.126215 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:06 crc kubenswrapper[4764]: E0320 14:53:06.126356 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.126428 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: E0320 14:53:06.126665 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.143005 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.157202 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:06Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.168796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.168855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.168873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.168899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.168916 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.271835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.271928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.271946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.271971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.271990 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.375212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.375272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.375291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.375314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.375335 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.478369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.478474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.478495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.478520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.478537 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.581559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.581624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.581644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.581669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.581686 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.684187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.684236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.684253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.684278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.684296 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.787428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.787502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.787519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.787544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.787562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.891138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.891196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.891214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.891238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.891256 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.994645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.994699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.994717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.994743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:06 crc kubenswrapper[4764]: I0320 14:53:06.994760 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:06Z","lastTransitionTime":"2026-03-20T14:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.138776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.138954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.139069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.139157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.139186 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.242073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.242155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.242179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.242208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.242232 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.345762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.345824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.345842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.345866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.345889 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.449749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.450138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.450321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.450602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.450820 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.554355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.554445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.554465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.554488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.554508 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.657215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.657597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.657765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.657902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.658024 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.760965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.761023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.761041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.761065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.761088 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.864295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.864688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.864897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.865123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.865339 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.968913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.968989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.969010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.969041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:07 crc kubenswrapper[4764]: I0320 14:53:07.969071 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:07Z","lastTransitionTime":"2026-03-20T14:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.072985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.073043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.073061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.073086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.073104 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.125874 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.125873 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.125939 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.126482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:53:08 crc kubenswrapper[4764]: E0320 14:53:08.126737 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 14:53:08 crc kubenswrapper[4764]: E0320 14:53:08.127013 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 14:53:08 crc kubenswrapper[4764]: E0320 14:53:08.127146 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63"
Mar 20 14:53:08 crc kubenswrapper[4764]: E0320 14:53:08.127244 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.176915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.176965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.176984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.177008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.177025 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.279994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.280083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.280101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.280127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.280146 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.383994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.384047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.384068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.384091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.384109 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.488093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.488558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.488578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.488603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.488621 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.591780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.591877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.591895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.591919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.591940 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.695435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.695525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.695543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.695570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.695620 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.798350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.798417 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.798429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.798470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.798482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.901535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.901598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.901618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.901643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:08 crc kubenswrapper[4764]: I0320 14:53:08.901662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:08Z","lastTransitionTime":"2026-03-20T14:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.004324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.004403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.004421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.004445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.004463 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.107013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.107060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.107078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.107100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.107116 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.151103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.168300 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.191123 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.209712 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.209760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.209776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.209799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.209817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.228473 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.247417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.263185 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.279946 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84
c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.292959 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311243 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.311482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.327391 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.346935 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.360214 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.376010 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.390541 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.406443 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc 
kubenswrapper[4764]: I0320 14:53:09.414114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.414153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.414161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.414176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.414186 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.435051 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:09Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.515931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.515994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.516011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.516034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.516053 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.619318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.619376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.619429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.619458 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.619479 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.722649 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.722709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.722729 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.722752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.722770 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.826064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.826128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.826145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.826172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.826198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.914548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:09 crc kubenswrapper[4764]: E0320 14:53:09.914704 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:53:09 crc kubenswrapper[4764]: E0320 14:53:09.915134 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:41.915099252 +0000 UTC m=+143.531288421 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.929371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.929449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.929468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.929491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:09 crc kubenswrapper[4764]: I0320 14:53:09.929508 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:09Z","lastTransitionTime":"2026-03-20T14:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.015855 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:53:42.01578966 +0000 UTC m=+143.631978829 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015902 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016052 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016111 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:42.016094469 +0000 UTC m=+143.632283628 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.015911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.016285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.016364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.016444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016575 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016596 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016615 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016638 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016660 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:42.016645756 +0000 UTC m=+143.632834915 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016670 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016678 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016689 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016758 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:42.016734499 +0000 UTC m=+143.632923658 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.016786 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:53:42.01677131 +0000 UTC m=+143.632960479 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.036602 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:10Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.041472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.041529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.041557 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.041582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.041598 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.060814 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:10Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.065680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.065735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.065754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.065777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.065795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.084405 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:10Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.088799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.088855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.088873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.088897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.088915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.108114 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:10Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.112995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.113056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.113074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.113097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.113114 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.126104 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.126176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.126199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.126426 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.126532 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.126699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.126799 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.126896 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.133765 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:10Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:10 crc kubenswrapper[4764]: E0320 14:53:10.133992 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.137105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.137153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.137171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.137194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.137212 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.239612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.239709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.239768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.239793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.239845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.342960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.343014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.343031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.343054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.343072 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.445596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.445661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.445678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.445703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.445727 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.548987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.549047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.549064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.549087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.549105 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.651713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.651756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.651772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.651795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.651812 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.755350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.755667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.755847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.755999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.756175 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.859261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.859575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.859621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.859647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.859663 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.962878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.962948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.962967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.962993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:10 crc kubenswrapper[4764]: I0320 14:53:10.963010 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:10Z","lastTransitionTime":"2026-03-20T14:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.066434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.066813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.066956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.067099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.067238 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.169833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.169874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.169885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.169926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.169938 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.273127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.273177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.273193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.273217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.273233 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.375905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.375960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.375975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.375995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.376013 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.479199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.479264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.479288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.479314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.479333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.583473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.583539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.583561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.583589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.583612 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.686002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.686041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.686054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.686070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.686080 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.789136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.789232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.789255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.789290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.789314 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.891827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.891926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.891948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.891978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.892002 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.995342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.995440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.995466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.995493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:11 crc kubenswrapper[4764]: I0320 14:53:11.995516 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:11Z","lastTransitionTime":"2026-03-20T14:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.097985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.098049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.098073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.098101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.098124 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.125347 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.125445 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.125360 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:12 crc kubenswrapper[4764]: E0320 14:53:12.125559 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.125624 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:12 crc kubenswrapper[4764]: E0320 14:53:12.125759 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:12 crc kubenswrapper[4764]: E0320 14:53:12.125879 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:12 crc kubenswrapper[4764]: E0320 14:53:12.126043 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.201696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.201754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.201774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.201798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.201818 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.304298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.304354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.304373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.304420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.304438 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.407849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.407905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.407923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.407947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.407966 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.511076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.511136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.511180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.511210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.511231 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.614877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.614936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.614952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.614978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.614996 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.717494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.717597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.717622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.717650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.717672 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.820817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.820872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.820893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.820922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.820939 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.923868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.923926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.923943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.923966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:12 crc kubenswrapper[4764]: I0320 14:53:12.923983 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:12Z","lastTransitionTime":"2026-03-20T14:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.027370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.027730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.027869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.028044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.028166 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.131086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.131140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.131157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.131179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.131196 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.234888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.234974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.234991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.235044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.235062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.338274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.338323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.338339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.338360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.338408 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.446620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.446690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.446709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.446733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.446750 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.549903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.549970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.549988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.550013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.550033 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.653129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.653190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.653209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.653233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.653251 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.755908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.756269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.756444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.756606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.756910 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.859692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.859793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.859819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.859909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.859938 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.963476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.964579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.964629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.964697 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:13 crc kubenswrapper[4764]: I0320 14:53:13.964718 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:13Z","lastTransitionTime":"2026-03-20T14:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.067960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.068024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.068041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.068070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.068089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.125699 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.125823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.125829 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:14 crc kubenswrapper[4764]: E0320 14:53:14.126011 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.126061 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:14 crc kubenswrapper[4764]: E0320 14:53:14.126116 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:14 crc kubenswrapper[4764]: E0320 14:53:14.126249 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:14 crc kubenswrapper[4764]: E0320 14:53:14.126428 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.171275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.171373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.171424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.171454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.171477 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.274371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.274477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.274545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.274574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.274639 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.377992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.378040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.378053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.378068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.378080 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.480586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.480639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.480656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.480678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.480695 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.584335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.584952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.585129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.585287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.585517 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.688463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.688510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.688526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.688549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.688566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.792893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.793426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.793611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.793823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.794107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.897830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.898211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.898341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.898474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:14 crc kubenswrapper[4764]: I0320 14:53:14.898555 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:14Z","lastTransitionTime":"2026-03-20T14:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.001275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.001358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.001371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.001419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.001432 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.105140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.105225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.105251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.105279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.105298 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.126693 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.209271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.209349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.209370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.209424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.209444 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.313137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.313208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.313226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.313252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.313272 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.417107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.417180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.417202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.417231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.417250 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.521688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.521761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.521784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.521811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.521831 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.625625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.625693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.625706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.625731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.625747 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.728788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.728869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.728894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.728926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.728950 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.831903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.831960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.831976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.831994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.832006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.851666 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.854980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.855445 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.881361 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.903197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.921795 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.935181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.935239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.935260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.935286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.935304 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:15Z","lastTransitionTime":"2026-03-20T14:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.941096 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.960034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:15 crc kubenswrapper[4764]: I0320 14:53:15.990530 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:15Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.009002 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.026354 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.039146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.039218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.039237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.039262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.039280 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.045945 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc 
kubenswrapper[4764]: I0320 14:53:16.069305 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.083328 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.105865 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.126366 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:16 crc kubenswrapper[4764]: E0320 14:53:16.126686 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.127050 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:16 crc kubenswrapper[4764]: E0320 14:53:16.127154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.127366 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:16 crc kubenswrapper[4764]: E0320 14:53:16.127525 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.127790 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:16 crc kubenswrapper[4764]: E0320 14:53:16.127977 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.136350 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.142446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.142488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.142500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.142518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.142544 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.155295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.171278 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.189203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"m
ountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:16Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.246240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.246301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.246318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.246342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.246368 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.349359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.349429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.349444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.349465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.349479 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.452862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.452938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.452957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.452990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.453008 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.555779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.555847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.555864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.555888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.555906 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.659340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.659740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.659757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.659785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.659803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.763961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.764035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.764053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.764084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.764103 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.867414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.867470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.867489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.867511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.867529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.970669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.970730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.970748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.970771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:16 crc kubenswrapper[4764]: I0320 14:53:16.970788 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:16Z","lastTransitionTime":"2026-03-20T14:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.073776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.073836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.073861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.073891 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.073915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.177323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.177416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.177434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.177459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.177481 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.281054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.281114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.281130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.281159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.281181 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.385046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.385101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.385118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.385140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.385157 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.488852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.488911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.488927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.488951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.488969 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.591985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.592059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.592078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.592102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.592120 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.695430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.695516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.695545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.695574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.695597 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.799641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.799723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.799748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.799798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.799821 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.902796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.902849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.902866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.902888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:17 crc kubenswrapper[4764]: I0320 14:53:17.902904 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:17Z","lastTransitionTime":"2026-03-20T14:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.006605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.006675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.006694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.006719 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.006736 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.110468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.110512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.110530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.110552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.110569 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.125793 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.125882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:18 crc kubenswrapper[4764]: E0320 14:53:18.126010 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.125802 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:18 crc kubenswrapper[4764]: E0320 14:53:18.126307 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.126639 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:18 crc kubenswrapper[4764]: E0320 14:53:18.126776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:18 crc kubenswrapper[4764]: E0320 14:53:18.127028 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.214244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.214300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.214318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.214341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.214359 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.318537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.319135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.319316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.319505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.319645 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.423755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.423813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.423830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.423855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.423874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.527978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.528030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.528042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.528066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.528079 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.631460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.631538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.631558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.631582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.631600 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.734819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.734879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.734896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.734920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.734938 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.838767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.838827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.838845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.838869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.838887 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.942986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.943376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.943585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.943741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:18 crc kubenswrapper[4764]: I0320 14:53:18.943885 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:18Z","lastTransitionTime":"2026-03-20T14:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.046984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.047047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.047064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.047088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.047107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:19Z","lastTransitionTime":"2026-03-20T14:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:19 crc kubenswrapper[4764]: E0320 14:53:19.147329 4764 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.150282 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c
3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.179658 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.200748 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.216368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.233190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.250526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: E0320 14:53:19.266153 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.275329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 
14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.299575 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.314610 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.334642 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.354936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.372832 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.398129 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.414830 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.430054 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:19 crc kubenswrapper[4764]: I0320 14:53:19.442115 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:19Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc 
kubenswrapper[4764]: I0320 14:53:20.125180 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.125241 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.125197 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.125185 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.125416 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.125526 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.125883 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.126094 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.127481 4764 scope.go:117] "RemoveContainer" containerID="396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.441002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.441068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.441080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.441099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.441110 4764 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:20Z","lastTransitionTime":"2026-03-20T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.459219 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.464228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.464266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.464278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.464296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.464311 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:20Z","lastTransitionTime":"2026-03-20T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.481514 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.485715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.486125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.486141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.486162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.486174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:20Z","lastTransitionTime":"2026-03-20T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.505646 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.510069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.510114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.510125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.510142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.510153 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:20Z","lastTransitionTime":"2026-03-20T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.529908 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.534217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.534258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.534276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.534300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.534319 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:20Z","lastTransitionTime":"2026-03-20T14:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.558474 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: E0320 14:53:20.558665 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.875866 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/1.log" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.879728 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2"} Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.880252 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.898784 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.917880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.939956 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.961067 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:20 crc kubenswrapper[4764]: I0320 14:53:20.979805 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.000282 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:20Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.015529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.036920 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.053468 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc 
kubenswrapper[4764]: I0320 14:53:21.081462 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\"
,\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.106197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.123115 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.145831 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.165001 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.182231 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.201499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.885207 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/2.log" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.886060 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/1.log" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.889358 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" exitCode=1 Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.889460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2"} Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.889565 4764 scope.go:117] "RemoveContainer" containerID="396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.890342 4764 scope.go:117] "RemoveContainer" containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" Mar 20 14:53:21 crc kubenswrapper[4764]: E0320 14:53:21.890618 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.912492 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.937672 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.971935 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396bd1a721707ec0bb21b84f27c4ad119b85b3ee02b4fd1ffaec3338378cd41b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:03Z\\\",\\\"message\\\":\\\"03.899586 6806 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 
14:53:03.899596 6806 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:03.899694 6806 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.899839 6806 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 14:53:03.900279 6806 factory.go:656] Stopping watch factory\\\\nI0320 14:53:03.900344 6806 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 14:53:03.900526 6806 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.900845 6806 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 14:53:03.901107 6806 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace 
for network=default\\\\nI0320 14:53:21.144647 7034 services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler 
{0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a6486164
2e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:21 crc kubenswrapper[4764]: I0320 14:53:21.989187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:21Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.015777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.057790 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.078691 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.095424 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.110321 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.123146 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.125340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.125366 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:22 crc kubenswrapper[4764]: E0320 14:53:22.125557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.125609 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.125625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:22 crc kubenswrapper[4764]: E0320 14:53:22.125762 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:22 crc kubenswrapper[4764]: E0320 14:53:22.125892 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:22 crc kubenswrapper[4764]: E0320 14:53:22.126015 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.136016 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.151536 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.163827 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.175076 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc 
kubenswrapper[4764]: I0320 14:53:22.196440 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.209205 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.896359 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/2.log" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.903048 4764 scope.go:117] "RemoveContainer" containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" Mar 20 14:53:22 crc kubenswrapper[4764]: E0320 14:53:22.903316 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.936824 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.956264 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.974465 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:22 crc kubenswrapper[4764]: I0320 14:53:22.990648 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:22Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc 
kubenswrapper[4764]: I0320 14:53:23.013613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.035721 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.048955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.069954 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.090083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: 
I0320 14:53:23.104744 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.122609 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.137103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.153601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.172440 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.189604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:23 crc kubenswrapper[4764]: I0320 14:53:23.209300 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:23Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:24 crc kubenswrapper[4764]: I0320 14:53:24.126086 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:24 crc kubenswrapper[4764]: I0320 14:53:24.126124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:24 crc kubenswrapper[4764]: I0320 14:53:24.126203 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:24 crc kubenswrapper[4764]: E0320 14:53:24.126283 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:24 crc kubenswrapper[4764]: I0320 14:53:24.126297 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:24 crc kubenswrapper[4764]: E0320 14:53:24.126542 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:24 crc kubenswrapper[4764]: E0320 14:53:24.126672 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:24 crc kubenswrapper[4764]: E0320 14:53:24.126792 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:24 crc kubenswrapper[4764]: E0320 14:53:24.267338 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:53:25 crc kubenswrapper[4764]: I0320 14:53:25.140074 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 14:53:26 crc kubenswrapper[4764]: I0320 14:53:26.126024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:26 crc kubenswrapper[4764]: I0320 14:53:26.126123 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:26 crc kubenswrapper[4764]: I0320 14:53:26.126189 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:26 crc kubenswrapper[4764]: I0320 14:53:26.126026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:26 crc kubenswrapper[4764]: E0320 14:53:26.126435 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:26 crc kubenswrapper[4764]: E0320 14:53:26.126805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:26 crc kubenswrapper[4764]: E0320 14:53:26.126923 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:26 crc kubenswrapper[4764]: E0320 14:53:26.127258 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:26 crc kubenswrapper[4764]: I0320 14:53:26.142304 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.125589 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.125611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.125691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.125766 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:28 crc kubenswrapper[4764]: E0320 14:53:28.125968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:28 crc kubenswrapper[4764]: E0320 14:53:28.126192 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:28 crc kubenswrapper[4764]: E0320 14:53:28.126502 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:28 crc kubenswrapper[4764]: E0320 14:53:28.126570 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.477904 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.511679 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.532074 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.550753 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.568901 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc 
kubenswrapper[4764]: I0320 14:53:28.590781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b
1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.611545 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.629254 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.655788 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.688026 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.707970 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.726356 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.748834 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.769271 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
4:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.793140 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.813163 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.833451 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.854418 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:28 crc kubenswrapper[4764]: I0320 14:53:28.872409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:28Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.155688 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.171650 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.194656 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.225678 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 
14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.248459 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: E0320 14:53:29.269575 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.275697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 
14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.301173 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.320043 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.340438 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.358820 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.378307 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.391944 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.406196 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.421042 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.438135 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.454205 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc 
kubenswrapper[4764]: I0320 14:53:29.485990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:29 crc kubenswrapper[4764]: I0320 14:53:29.504880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:29Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.126063 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.126193 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.126310 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.126080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.126222 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.126487 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.126683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.126791 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.814685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.814740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.814760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.814786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.814806 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:30Z","lastTransitionTime":"2026-03-20T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.836580 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:30Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.842888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.842948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.842969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.842998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.843078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:30Z","lastTransitionTime":"2026-03-20T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.865568 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:30Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.871079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.871130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.871148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.871171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.871190 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:30Z","lastTransitionTime":"2026-03-20T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.894031 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...status patch payload identical to the 14:53:30.865568 attempt above; duplicate conditions and image list elided...] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:30Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.899619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.899655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.899668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.899690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.899705 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:30Z","lastTransitionTime":"2026-03-20T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.919322 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:30Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.925176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.925261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.925274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.925295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:30 crc kubenswrapper[4764]: I0320 14:53:30.925309 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:30Z","lastTransitionTime":"2026-03-20T14:53:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.947500 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:30Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:30 crc kubenswrapper[4764]: E0320 14:53:30.947734 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:53:32 crc kubenswrapper[4764]: I0320 14:53:32.125641 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:32 crc kubenswrapper[4764]: I0320 14:53:32.125714 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:32 crc kubenswrapper[4764]: I0320 14:53:32.125756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:32 crc kubenswrapper[4764]: I0320 14:53:32.125828 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:32 crc kubenswrapper[4764]: E0320 14:53:32.125944 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:32 crc kubenswrapper[4764]: E0320 14:53:32.126179 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:32 crc kubenswrapper[4764]: E0320 14:53:32.126266 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:32 crc kubenswrapper[4764]: E0320 14:53:32.126429 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:34 crc kubenswrapper[4764]: I0320 14:53:34.126120 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:34 crc kubenswrapper[4764]: I0320 14:53:34.126120 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:34 crc kubenswrapper[4764]: I0320 14:53:34.126131 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:34 crc kubenswrapper[4764]: E0320 14:53:34.127520 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:34 crc kubenswrapper[4764]: E0320 14:53:34.127635 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:34 crc kubenswrapper[4764]: I0320 14:53:34.126174 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:34 crc kubenswrapper[4764]: E0320 14:53:34.127804 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:34 crc kubenswrapper[4764]: E0320 14:53:34.127974 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:34 crc kubenswrapper[4764]: E0320 14:53:34.270677 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:53:36 crc kubenswrapper[4764]: I0320 14:53:36.125987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:36 crc kubenswrapper[4764]: I0320 14:53:36.126070 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:36 crc kubenswrapper[4764]: I0320 14:53:36.126143 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:36 crc kubenswrapper[4764]: E0320 14:53:36.126194 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:36 crc kubenswrapper[4764]: I0320 14:53:36.126253 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:36 crc kubenswrapper[4764]: E0320 14:53:36.126574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:36 crc kubenswrapper[4764]: E0320 14:53:36.126699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:36 crc kubenswrapper[4764]: E0320 14:53:36.127295 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:36 crc kubenswrapper[4764]: I0320 14:53:36.127756 4764 scope.go:117] "RemoveContainer" containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" Mar 20 14:53:36 crc kubenswrapper[4764]: E0320 14:53:36.128109 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:37 crc kubenswrapper[4764]: I0320 14:53:37.142091 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 14:53:38 crc kubenswrapper[4764]: I0320 14:53:38.125323 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:38 crc kubenswrapper[4764]: I0320 14:53:38.125358 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:38 crc kubenswrapper[4764]: E0320 14:53:38.125536 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:38 crc kubenswrapper[4764]: I0320 14:53:38.125604 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:38 crc kubenswrapper[4764]: E0320 14:53:38.125794 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:38 crc kubenswrapper[4764]: E0320 14:53:38.125900 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:38 crc kubenswrapper[4764]: I0320 14:53:38.126257 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:38 crc kubenswrapper[4764]: E0320 14:53:38.126629 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.141484 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.159988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.175935 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.190544 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.204852 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.220647 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.234593 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.251341 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc 
kubenswrapper[4764]: E0320 14:53:39.271923 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.282962 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640
f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.303048 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.321145 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.336566 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.359084 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.389273 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.405613 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.426913 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.447223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.464343 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84
c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:39 crc kubenswrapper[4764]: I0320 14:53:39.481470 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:39Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.125823 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:40 crc kubenswrapper[4764]: E0320 14:53:40.126113 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.125823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:40 crc kubenswrapper[4764]: E0320 14:53:40.126297 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.125857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:40 crc kubenswrapper[4764]: E0320 14:53:40.126481 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.125848 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:40 crc kubenswrapper[4764]: E0320 14:53:40.126759 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.975589 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/0.log" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.975638 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f85a77d-475e-43c9-8181-093451bc058f" containerID="eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582" exitCode=1 Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.975672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerDied","Data":"eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582"} Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.976041 4764 scope.go:117] "RemoveContainer" containerID="eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582" Mar 20 14:53:40 crc kubenswrapper[4764]: I0320 14:53:40.998352 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:40Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.015015 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.035138 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.056895 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.079169 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.098266 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.126297 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.146456 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.157995 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.166868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.166929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.166949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.166972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.166989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:41Z","lastTransitionTime":"2026-03-20T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.172230 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc 
kubenswrapper[4764]: E0320 14:53:41.187660 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193459 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:41Z","lastTransitionTime":"2026-03-20T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.193589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: E0320 14:53:41.216004 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.220710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.220763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.220781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.220803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.220818 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:41Z","lastTransitionTime":"2026-03-20T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.226481 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: E0320 14:53:41.238199 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.239702 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.242298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 
14:53:41.242326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.242339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.242357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.242370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:41Z","lastTransitionTime":"2026-03-20T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:41 crc kubenswrapper[4764]: E0320 14:53:41.257398 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.259123 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4
d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda20
32172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.260414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.260447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.260461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.260476 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.260486 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:41Z","lastTransitionTime":"2026-03-20T14:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.275783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: E0320 14:53:41.276712 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: E0320 14:53:41.276873 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.288913 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd1
5c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.304561 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.319460 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.339059 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:41Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.986888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/0.log" Mar 20 14:53:41 crc kubenswrapper[4764]: I0320 14:53:41.986987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerStarted","Data":"7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc"} Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.009665 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.009841 4764 configmap.go:193] Couldn't get 
configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.009942 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.009916957 +0000 UTC m=+207.626106126 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.020684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.039514 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.057466 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.074477 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc 
kubenswrapper[4764]: I0320 14:53:42.090682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.110558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.110720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.110762 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.110734963 +0000 UTC m=+207.726924132 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.110861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.110889 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.110949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.110988 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.110943299 +0000 UTC m=+207.727132468 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.111025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111115 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111148 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111166 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs podName:4c881e2f-a84e-4621-9e1e-f2197d698a63 nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.111150453 +0000 UTC m=+207.727339612 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs") pod "network-metrics-daemon-fb2k7" (UID: "4c881e2f-a84e-4621-9e1e-f2197d698a63") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111170 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111198 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111242 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.111230385 +0000 UTC m=+207.727419554 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111311 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111338 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111352 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.111431 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 14:54:46.11141197 +0000 UTC m=+207.727601119 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.111822 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.125691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.125723 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.125730 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.125910 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.125957 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.126129 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.126247 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:42 crc kubenswrapper[4764]: E0320 14:53:42.126359 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.130260 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.146269 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.161432 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0
b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d
9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.192232 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 
14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.207201 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.217988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.230979 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.248575 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.264365 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.280203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.293117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.308415 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:42 crc kubenswrapper[4764]: I0320 14:53:42.319589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:42Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:44 crc kubenswrapper[4764]: I0320 14:53:44.125705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:44 crc kubenswrapper[4764]: I0320 14:53:44.125705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:44 crc kubenswrapper[4764]: E0320 14:53:44.126732 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:44 crc kubenswrapper[4764]: I0320 14:53:44.125740 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:44 crc kubenswrapper[4764]: E0320 14:53:44.126818 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:44 crc kubenswrapper[4764]: I0320 14:53:44.125725 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:44 crc kubenswrapper[4764]: E0320 14:53:44.126894 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:44 crc kubenswrapper[4764]: E0320 14:53:44.126988 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:44 crc kubenswrapper[4764]: E0320 14:53:44.273201 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:53:46 crc kubenswrapper[4764]: I0320 14:53:46.126233 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:46 crc kubenswrapper[4764]: I0320 14:53:46.126233 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:46 crc kubenswrapper[4764]: I0320 14:53:46.126454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:46 crc kubenswrapper[4764]: E0320 14:53:46.126751 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:46 crc kubenswrapper[4764]: E0320 14:53:46.126846 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:46 crc kubenswrapper[4764]: E0320 14:53:46.126953 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:46 crc kubenswrapper[4764]: I0320 14:53:46.127523 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:46 crc kubenswrapper[4764]: E0320 14:53:46.127841 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:47 crc kubenswrapper[4764]: I0320 14:53:47.127609 4764 scope.go:117] "RemoveContainer" containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.010570 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/2.log" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.013925 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.014331 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.035804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.056564 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.079469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.102046 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.124426 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.125472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.125531 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.125566 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:48 crc kubenswrapper[4764]: E0320 14:53:48.125670 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.125687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:48 crc kubenswrapper[4764]: E0320 14:53:48.125868 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:48 crc kubenswrapper[4764]: E0320 14:53:48.125957 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:48 crc kubenswrapper[4764]: E0320 14:53:48.126108 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.142149 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.176456 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.198462 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.213693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.225253 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc 
kubenswrapper[4764]: I0320 14:53:48.236759 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.257190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.274029 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.287505 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.310640 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.342763 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler 
{0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.360246 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.374857 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:48 crc kubenswrapper[4764]: I0320 14:53:48.390207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:48Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.020467 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/3.log" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.021569 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/2.log" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.024931 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" exitCode=1 Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.024969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.025019 4764 scope.go:117] "RemoveContainer" 
containerID="eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.026074 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:53:49 crc kubenswrapper[4764]: E0320 14:53:49.026337 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.051955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:48Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0320 14:53:48.160847 7304 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 14:53:48.160868 7304 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0320 14:53:48.160898 7304 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 14:53:48.160919 7304 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:48.160936 7304 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:48.160938 7304 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:48.160948 7304 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:48.160950 7304 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:48.160964 7304 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:48.160984 7304 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 14:53:48.160997 7304 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 14:53:48.161010 7304 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:48.161060 7304 factory.go:656] Stopping watch factory\\\\nI0320 14:53:48.161083 7304 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc 
kubenswrapper[4764]: I0320 14:53:49.067668 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.082867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.094885 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.106264 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.122751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.139113 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: 
I0320 14:53:49.152844 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c16786498ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.170811 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.182666 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26a
a7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.198768 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.218519 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.230440 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.250612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.268427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: E0320 14:53:49.274420 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.289132 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.309550 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.329699 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.346676 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc 
kubenswrapper[4764]: I0320 14:53:49.361270 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.377424 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.392692 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.407060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.428958 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3
d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.457997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb3794fe77342438c2b0113e08999240fc5d615a985fd7ba6f6d6076eba8b3f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:21Z\\\",\\\"message\\\":\\\" Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144649 7034 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 14:53:21.144660 7034 services_controller.go:356] Processing sync for service openshift-marketplace/redhat-marketplace for network=default\\\\nI0320 14:53:21.144647 7034 
services_controller.go:451] Built service openshift-etcd-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 14:53:21.144703 7034 services_controller.go:452] Built service openshift-etcd-operator/metrics per-node LB for network=default: []services.LB{}\\\\nF0320 14:53:21.144711 7034 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: handler {0x1fe50\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:48Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0320 14:53:48.160847 7304 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 14:53:48.160868 7304 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0320 14:53:48.160898 7304 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 14:53:48.160919 7304 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:48.160936 7304 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:48.160938 7304 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:48.160948 7304 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:48.160950 7304 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:48.160964 7304 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:48.160984 7304 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 14:53:48.160997 7304 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 14:53:48.161010 7304 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:48.161060 7304 factory.go:656] Stopping watch factory\\\\nI0320 14:53:48.161083 7304 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc 
kubenswrapper[4764]: I0320 14:53:49.470272 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.482996 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.502810 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.519355 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.534082 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.552762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.570118 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.585286 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.598105 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.617364 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.633595 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.650698 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:49 crc kubenswrapper[4764]: I0320 14:53:49.666092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:49Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc 
kubenswrapper[4764]: I0320 14:53:50.031488 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/3.log" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.037245 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:53:50 crc kubenswrapper[4764]: E0320 14:53:50.037661 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.059288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d1b443188c5b78ed3c30f436332e736c2ba5695b69987c3283ca0e338d94fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4118fa03993ef35e204558dde1e3acfe320cf8a947db7d72aa92c936fed1f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.076011 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8nwvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db133590-6855-4e67-92cd-353b342f66fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e0ab32113316ec6f26aa7772c3c15020db45268c32f36ce7db95ef35a41dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2sx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8nwvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.096351 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f57fc6fc-808a-4582-bc7d-c0030311afea\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70543c47e9e79071fcf3bf18832d56a6ce2744f4336d0a99fc50e84a2d22b99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://2cc317551507ae7b977b87eafd9dd833775b590b144f5b2e27861816762e4e64\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:51:51Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 14:51:21.551321 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 14:51:21.554692 1 observer_polling.go:159] Starting file observer\\\\nI0320 14:51:21.606337 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 14:51:21.610495 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 14:51:51.839888 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a885928113c47ef88cb200361015553722b14e453a1fe597dc7cbbc701585878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://177b2306e32b949702cfcfda1dedf244c68b4fb5e84f47bfc1ea60c528c8a08c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.119467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e7a40d6eeaed9b964ddac2559ade048baa110764eea89af330cc66955858e19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.126085 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.126138 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.126138 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.126104 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:50 crc kubenswrapper[4764]: E0320 14:53:50.126293 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:50 crc kubenswrapper[4764]: E0320 14:53:50.126446 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:50 crc kubenswrapper[4764]: E0320 14:53:50.126550 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:50 crc kubenswrapper[4764]: E0320 14:53:50.126627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.141200 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.162272 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.200237 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8090bac4-1078-4c83-aede-09ff50814609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d36ea82623ef2c3f1da1d3cfa05f3987dd8d28dfcd5eb12e306423d335de3cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e1378dc87462162831f451ec9b152a76a04b7bd72172481ea967b7b8282ae54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce087845c6d2e256b91cd44bb67da93fc6e74ca36324f55a7f32307385b652c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89640f4a4d31612bc19532a65717e058d64cef36e66610f54b90edd2727a755e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdc66ffe35fc411767c78e7eee3196daa39c5f674624e80b59b26973cc6f2ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37763fa02d92f2720b9561a86f216a29ceae19e9d2b0cffc10e7bf2cc4a3eb1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79be229a304bbcdb9b3465d790050e6035c879bf00e8f9b88f31e0cbe9d3de9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6e228f2da396e332911248fd209f66894609eaf4e283859dd7d8521ef8dbe97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.218311 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.236999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f81c61d2b9d8d307118da7750440e179496e1495f7dc4eedd45046c9c29d6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.252989 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c881e2f-a84e-4621-9e1e-f2197d698a63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcnvw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fb2k7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc 
kubenswrapper[4764]: I0320 14:53:50.275905 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-279gq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07de6dd3-cfb3-49f7-9ac3-6c3a522ff349\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e11513296ee788838020c340831b4ddfea9016af5f70fa38d771b16b8df18e45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c87846436f7e8b2341f9f2646cf913e894123112d89138f474767e892be19062\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca6350331db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ec950f4ff0b6eee810fca635033
1db95ba0d2070c8f77b2e91447bb52c68b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29c914a43f7d9020faa996251a765f21788377ddc15ae4c0a20ae7f94309aed8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d7c3d0739f076d64f19c8eba5ba2ee3e645ba4edc6202c5c17b38cdefec9b3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4762f24d97291a3aafd9046a441054a65672ca5ac382f3afe783d13982857a55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02447330eff4d7d2204092579497592be0d11a0b628941f5ff46924c808d27b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:53:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q82f6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-279gq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.306774 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2a6c163-0457-4626-9bbb-5628a5155673\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:48Z\\\",\\\"message\\\":\\\" 1 for removal\\\\nI0320 14:53:48.160847 7304 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 14:53:48.160868 7304 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 14:53:48.160890 7304 handler.go:208] 
Removed *v1.EgressFirewall event handler 9\\\\nI0320 14:53:48.160898 7304 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 14:53:48.160919 7304 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 14:53:48.160936 7304 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 14:53:48.160938 7304 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 14:53:48.160948 7304 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 14:53:48.160950 7304 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 14:53:48.160964 7304 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 14:53:48.160984 7304 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 14:53:48.160997 7304 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 14:53:48.161010 7304 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 14:53:48.161060 7304 factory.go:656] Stopping watch factory\\\\nI0320 14:53:48.161083 7304 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:53:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e49656a1516fe7898
38ccc88fa4a48867e52c972c3ffd2b943a64861642e016\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s727g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p5lds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.323517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4897c9d2-c4df-4203-9132-8ebf9375cbf2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0332bd896c2dc6d02397939e2edb7b192c381d5581f65b8780caa0fadd4493d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beefcdebf7d64fe7e59a628781d0a4571e114698aa399ad032ac5ed39d437ab7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.343943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa943b85-f686-4d41-8822-f29c0a6defdf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T14:52:30Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0320 14:52:29.949684 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 14:52:29.949998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 14:52:29.950997 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1012451488/tls.crt::/tmp/serving-cert-1012451488/tls.key\\\\\\\"\\\\nI0320 14:52:30.194508 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 14:52:30.199061 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 14:52:30.199105 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 14:52:30.199149 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 14:52:30.199189 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 14:52:30.209300 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 14:52:30.209332 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 14:52:30.209349 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 14:52:30.209432 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 14:52:30.209436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 14:52:30.209440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 14:52:30.209445 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 14:52:30.212243 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2638d3d74f6ecda3e13502deda2032172e
521ffd1944058fd72df62cbf6eafe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.362814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8eacbac-cab3-4632-b5b9-10585b7e3013\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:51:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb07b2638b074959f3554a18ae97b2e3003af3363c035859fbfc11d0be25bd8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5d48409f7efcb41bb9e49779dd73c409e6c5724c3169e0e236c8432c8810a18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e81607c883d0ae10f09df76916ee9f179522d335e04a674d5d91e8fa53e1493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:51:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://da9fc482cc7d10e98f48d29710d6215961f98097ea074c5e397fcc3cbb7764d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T14:51:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T14:51:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:51:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.377397 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7kvvv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cf737ac-eb6b-499e-aa94-a37f8ced743b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750
c2cf0fa65e654fd03d428926e51ec43bb6fd998d519908902299bebab789f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h49t2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7kvvv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.394117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cf5cd911-963e-480f-8bc2-6be581e6d9e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb9736c640afdd7e758a1915b5a6d0350031b25ca349b7f635c56833cf7f92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71a
f9253b4aadbce38d38b57f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-skbfv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wln5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.409607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84d09626-92c9-4c82-8dac-959885a658ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f247ba694009e6121205bfcb9f9fe570178cdd4626c2899fbec3435ca67d8582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3263342dcc5dd9e9a80210fb2852c1678649
8ec2e1a53dd1427f5a008ba4e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:52:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kpd8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wczhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:50 crc kubenswrapper[4764]: I0320 14:53:50.444781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4m5r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1f85a77d-475e-43c9-8181-093451bc058f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:52:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T14:53:40Z\\\",\\\"message\\\":\\\"2026-03-20T14:52:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3\\\\n2026-03-20T14:52:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e3b038c3-d4c4-4c00-b3c0-bf313ebbaad3 to /host/opt/cni/bin/\\\\n2026-03-20T14:52:55Z [verbose] multus-daemon started\\\\n2026-03-20T14:52:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T14:53:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T14:52:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T14:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht7ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T14:52:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4m5r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:50Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.506564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.506658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.506678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.506743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.506761 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:51Z","lastTransitionTime":"2026-03-20T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.528478 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.534227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.534298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.534317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.534346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.534364 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:51Z","lastTransitionTime":"2026-03-20T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.554265 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.558971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.559018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.559034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.559057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.559074 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:51Z","lastTransitionTime":"2026-03-20T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.578746 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status-patch payload identical to the preceding attempt; repeated payload elided] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.584085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.584153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.584174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.584201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.584224 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:51Z","lastTransitionTime":"2026-03-20T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.603515 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status-patch payload identical to the preceding attempts; repeated payload elided] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.614249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.614308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.614327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.614354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:53:51 crc kubenswrapper[4764]: I0320 14:53:51.614373 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:53:51Z","lastTransitionTime":"2026-03-20T14:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.634165 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T14:53:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0eabb4fb-5c38-457c-9d72-ffd7b7b559d0\\\",\\\"systemUUID\\\":\\\"692a0227-1a43-4617-b4d8-dc30f2b9fadb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T14:53:51Z is after 2025-08-24T17:21:41Z" Mar 20 14:53:51 crc kubenswrapper[4764]: E0320 14:53:51.634412 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:53:52 crc kubenswrapper[4764]: I0320 14:53:52.125880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:52 crc kubenswrapper[4764]: I0320 14:53:52.125894 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:52 crc kubenswrapper[4764]: I0320 14:53:52.126000 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:52 crc kubenswrapper[4764]: E0320 14:53:52.126189 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:52 crc kubenswrapper[4764]: I0320 14:53:52.126257 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:52 crc kubenswrapper[4764]: E0320 14:53:52.126409 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:52 crc kubenswrapper[4764]: E0320 14:53:52.126519 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:52 crc kubenswrapper[4764]: E0320 14:53:52.126600 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:54 crc kubenswrapper[4764]: I0320 14:53:54.125644 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:54 crc kubenswrapper[4764]: I0320 14:53:54.125645 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:54 crc kubenswrapper[4764]: I0320 14:53:54.125878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:54 crc kubenswrapper[4764]: I0320 14:53:54.125914 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:54 crc kubenswrapper[4764]: E0320 14:53:54.126044 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:54 crc kubenswrapper[4764]: E0320 14:53:54.126119 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:54 crc kubenswrapper[4764]: E0320 14:53:54.126213 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:54 crc kubenswrapper[4764]: E0320 14:53:54.126428 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:54 crc kubenswrapper[4764]: E0320 14:53:54.276556 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:53:56 crc kubenswrapper[4764]: I0320 14:53:56.125370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:56 crc kubenswrapper[4764]: I0320 14:53:56.125534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:56 crc kubenswrapper[4764]: I0320 14:53:56.125734 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:56 crc kubenswrapper[4764]: E0320 14:53:56.125718 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:56 crc kubenswrapper[4764]: I0320 14:53:56.125833 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:56 crc kubenswrapper[4764]: E0320 14:53:56.125914 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:56 crc kubenswrapper[4764]: E0320 14:53:56.126047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:56 crc kubenswrapper[4764]: E0320 14:53:56.126227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:58 crc kubenswrapper[4764]: I0320 14:53:58.125207 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:53:58 crc kubenswrapper[4764]: I0320 14:53:58.125207 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:53:58 crc kubenswrapper[4764]: E0320 14:53:58.125852 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:53:58 crc kubenswrapper[4764]: I0320 14:53:58.125447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:53:58 crc kubenswrapper[4764]: E0320 14:53:58.125974 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:53:58 crc kubenswrapper[4764]: I0320 14:53:58.125320 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:53:58 crc kubenswrapper[4764]: E0320 14:53:58.126154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:53:58 crc kubenswrapper[4764]: E0320 14:53:58.126667 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.154990 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podStartSLOduration=119.15493983 podStartE2EDuration="1m59.15493983s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.153713109 +0000 UTC m=+160.769902258" watchObservedRunningTime="2026-03-20 14:53:59.15493983 +0000 UTC m=+160.771128959" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.194698 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wczhh" podStartSLOduration=118.194658609 podStartE2EDuration="1m58.194658609s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.172231214 +0000 UTC m=+160.788420373" watchObservedRunningTime="2026-03-20 14:53:59.194658609 +0000 UTC m=+160.810847778" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.212526 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d4m5r" 
podStartSLOduration=119.212501297 podStartE2EDuration="1m59.212501297s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.194465674 +0000 UTC m=+160.810654833" watchObservedRunningTime="2026-03-20 14:53:59.212501297 +0000 UTC m=+160.828690466" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.261175 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8nwvm" podStartSLOduration=119.261144385 podStartE2EDuration="1m59.261144385s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.242791554 +0000 UTC m=+160.858980693" watchObservedRunningTime="2026-03-20 14:53:59.261144385 +0000 UTC m=+160.877333554" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.261511 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=34.261504884 podStartE2EDuration="34.261504884s" podCreationTimestamp="2026-03-20 14:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.260297014 +0000 UTC m=+160.876486183" watchObservedRunningTime="2026-03-20 14:53:59.261504884 +0000 UTC m=+160.877694053" Mar 20 14:53:59 crc kubenswrapper[4764]: E0320 14:53:59.277334 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.348927 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=54.348900426 podStartE2EDuration="54.348900426s" podCreationTimestamp="2026-03-20 14:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.348420184 +0000 UTC m=+160.964609323" watchObservedRunningTime="2026-03-20 14:53:59.348900426 +0000 UTC m=+160.965089585" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.395600 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7kvvv" podStartSLOduration=119.395576454 podStartE2EDuration="1m59.395576454s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.394488836 +0000 UTC m=+161.010677975" watchObservedRunningTime="2026-03-20 14:53:59.395576454 +0000 UTC m=+161.011765623" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.464640 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-279gq" podStartSLOduration=118.464610835 podStartE2EDuration="1m58.464610835s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.429842673 +0000 UTC m=+161.046031832" watchObservedRunningTime="2026-03-20 14:53:59.464610835 +0000 UTC m=+161.080799964" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.482105 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.482076924 podStartE2EDuration="22.482076924s" 
podCreationTimestamp="2026-03-20 14:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.481218001 +0000 UTC m=+161.097407160" watchObservedRunningTime="2026-03-20 14:53:59.482076924 +0000 UTC m=+161.098266093" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.502264 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.50224536 podStartE2EDuration="1m21.50224536s" podCreationTimestamp="2026-03-20 14:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.501327848 +0000 UTC m=+161.117517017" watchObservedRunningTime="2026-03-20 14:53:59.50224536 +0000 UTC m=+161.118434529" Mar 20 14:53:59 crc kubenswrapper[4764]: I0320 14:53:59.517870 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.517836631 podStartE2EDuration="33.517836631s" podCreationTimestamp="2026-03-20 14:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:53:59.517630696 +0000 UTC m=+161.133819835" watchObservedRunningTime="2026-03-20 14:53:59.517836631 +0000 UTC m=+161.134025800" Mar 20 14:54:00 crc kubenswrapper[4764]: I0320 14:54:00.126077 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:00 crc kubenswrapper[4764]: E0320 14:54:00.126268 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:00 crc kubenswrapper[4764]: I0320 14:54:00.126356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:00 crc kubenswrapper[4764]: I0320 14:54:00.126361 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:00 crc kubenswrapper[4764]: E0320 14:54:00.126631 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:00 crc kubenswrapper[4764]: E0320 14:54:00.126805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:00 crc kubenswrapper[4764]: I0320 14:54:00.127174 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:00 crc kubenswrapper[4764]: E0320 14:54:00.127362 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.126729 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:54:01 crc kubenswrapper[4764]: E0320 14:54:01.127001 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.736973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.737053 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.737073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.737098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.737119 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T14:54:01Z","lastTransitionTime":"2026-03-20T14:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.805211 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs"] Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.805842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.808930 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.809934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.810427 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.815144 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.942467 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.942590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.942682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0428029-854b-4cb8-8c42-7cf80d74ca61-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.942782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0428029-854b-4cb8-8c42-7cf80d74ca61-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:01 crc kubenswrapper[4764]: I0320 14:54:01.942817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0428029-854b-4cb8-8c42-7cf80d74ca61-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.044643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0428029-854b-4cb8-8c42-7cf80d74ca61-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.044886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0428029-854b-4cb8-8c42-7cf80d74ca61-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.044930 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0428029-854b-4cb8-8c42-7cf80d74ca61-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.044988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.045041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.045160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.045242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b0428029-854b-4cb8-8c42-7cf80d74ca61-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.047048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0428029-854b-4cb8-8c42-7cf80d74ca61-service-ca\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.054024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0428029-854b-4cb8-8c42-7cf80d74ca61-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.073515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b0428029-854b-4cb8-8c42-7cf80d74ca61-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-nt8zs\" (UID: \"b0428029-854b-4cb8-8c42-7cf80d74ca61\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.125253 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.125268 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.125362 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:02 crc kubenswrapper[4764]: E0320 14:54:02.125968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:02 crc kubenswrapper[4764]: E0320 14:54:02.125676 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.125439 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:02 crc kubenswrapper[4764]: E0320 14:54:02.126541 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:02 crc kubenswrapper[4764]: E0320 14:54:02.126104 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.130055 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.232637 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 14:54:02 crc kubenswrapper[4764]: I0320 14:54:02.243558 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 14:54:03 crc kubenswrapper[4764]: I0320 14:54:03.088479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" event={"ID":"b0428029-854b-4cb8-8c42-7cf80d74ca61","Type":"ContainerStarted","Data":"24beb8136219cb8c020eec9ab136bb3840f80a54387fa51cad3b0ea396d18da6"} Mar 20 14:54:03 crc kubenswrapper[4764]: I0320 14:54:03.088545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" event={"ID":"b0428029-854b-4cb8-8c42-7cf80d74ca61","Type":"ContainerStarted","Data":"c6d06043f3f5e88e79eba6bfedb17b570d088e20b50ecc2651a7a6a6d414c5a0"} Mar 20 14:54:04 crc kubenswrapper[4764]: I0320 14:54:04.125937 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:04 crc kubenswrapper[4764]: I0320 14:54:04.125981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:04 crc kubenswrapper[4764]: I0320 14:54:04.126039 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:04 crc kubenswrapper[4764]: E0320 14:54:04.126166 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:04 crc kubenswrapper[4764]: E0320 14:54:04.126294 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:04 crc kubenswrapper[4764]: I0320 14:54:04.126408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:04 crc kubenswrapper[4764]: E0320 14:54:04.126476 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:04 crc kubenswrapper[4764]: E0320 14:54:04.126502 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:04 crc kubenswrapper[4764]: E0320 14:54:04.279869 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:06 crc kubenswrapper[4764]: I0320 14:54:06.125886 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:06 crc kubenswrapper[4764]: I0320 14:54:06.125957 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:06 crc kubenswrapper[4764]: I0320 14:54:06.126010 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:06 crc kubenswrapper[4764]: I0320 14:54:06.126131 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:06 crc kubenswrapper[4764]: E0320 14:54:06.126132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:06 crc kubenswrapper[4764]: E0320 14:54:06.126274 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:06 crc kubenswrapper[4764]: E0320 14:54:06.126365 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:06 crc kubenswrapper[4764]: E0320 14:54:06.126543 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:08 crc kubenswrapper[4764]: I0320 14:54:08.126189 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:08 crc kubenswrapper[4764]: I0320 14:54:08.126236 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:08 crc kubenswrapper[4764]: I0320 14:54:08.126254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:08 crc kubenswrapper[4764]: I0320 14:54:08.126189 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:08 crc kubenswrapper[4764]: E0320 14:54:08.126370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:08 crc kubenswrapper[4764]: E0320 14:54:08.126510 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:08 crc kubenswrapper[4764]: E0320 14:54:08.126648 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:08 crc kubenswrapper[4764]: E0320 14:54:08.126779 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:09 crc kubenswrapper[4764]: E0320 14:54:09.280374 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:10 crc kubenswrapper[4764]: I0320 14:54:10.125817 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:10 crc kubenswrapper[4764]: I0320 14:54:10.125875 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:10 crc kubenswrapper[4764]: I0320 14:54:10.125939 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:10 crc kubenswrapper[4764]: I0320 14:54:10.126027 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:10 crc kubenswrapper[4764]: E0320 14:54:10.126030 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:10 crc kubenswrapper[4764]: E0320 14:54:10.126160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:10 crc kubenswrapper[4764]: E0320 14:54:10.126304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:10 crc kubenswrapper[4764]: E0320 14:54:10.126421 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:12 crc kubenswrapper[4764]: I0320 14:54:12.125228 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:12 crc kubenswrapper[4764]: I0320 14:54:12.125245 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:12 crc kubenswrapper[4764]: I0320 14:54:12.125275 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:12 crc kubenswrapper[4764]: I0320 14:54:12.125319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:12 crc kubenswrapper[4764]: E0320 14:54:12.126638 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:12 crc kubenswrapper[4764]: E0320 14:54:12.126492 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:12 crc kubenswrapper[4764]: E0320 14:54:12.126761 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:12 crc kubenswrapper[4764]: E0320 14:54:12.126303 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:14 crc kubenswrapper[4764]: I0320 14:54:14.126178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:14 crc kubenswrapper[4764]: I0320 14:54:14.126219 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:14 crc kubenswrapper[4764]: I0320 14:54:14.126268 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:14 crc kubenswrapper[4764]: I0320 14:54:14.126201 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.126362 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.126650 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.127411 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.127496 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:14 crc kubenswrapper[4764]: I0320 14:54:14.127665 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.128059 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:54:14 crc kubenswrapper[4764]: E0320 14:54:14.281449 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:16 crc kubenswrapper[4764]: I0320 14:54:16.125574 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:16 crc kubenswrapper[4764]: I0320 14:54:16.125615 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:16 crc kubenswrapper[4764]: I0320 14:54:16.125637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:16 crc kubenswrapper[4764]: E0320 14:54:16.125768 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:16 crc kubenswrapper[4764]: I0320 14:54:16.125796 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:16 crc kubenswrapper[4764]: E0320 14:54:16.125961 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:16 crc kubenswrapper[4764]: E0320 14:54:16.126095 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:16 crc kubenswrapper[4764]: E0320 14:54:16.126191 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:18 crc kubenswrapper[4764]: I0320 14:54:18.126119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:18 crc kubenswrapper[4764]: I0320 14:54:18.126184 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:18 crc kubenswrapper[4764]: I0320 14:54:18.126207 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:18 crc kubenswrapper[4764]: E0320 14:54:18.126264 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:18 crc kubenswrapper[4764]: I0320 14:54:18.126292 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:18 crc kubenswrapper[4764]: E0320 14:54:18.126547 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:18 crc kubenswrapper[4764]: E0320 14:54:18.126591 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:18 crc kubenswrapper[4764]: E0320 14:54:18.126698 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:19 crc kubenswrapper[4764]: E0320 14:54:19.282136 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:20 crc kubenswrapper[4764]: I0320 14:54:20.125221 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:20 crc kubenswrapper[4764]: I0320 14:54:20.125279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:20 crc kubenswrapper[4764]: I0320 14:54:20.125371 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:20 crc kubenswrapper[4764]: I0320 14:54:20.125221 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:20 crc kubenswrapper[4764]: E0320 14:54:20.125423 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:20 crc kubenswrapper[4764]: E0320 14:54:20.125566 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:20 crc kubenswrapper[4764]: E0320 14:54:20.125657 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:20 crc kubenswrapper[4764]: E0320 14:54:20.125754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:22 crc kubenswrapper[4764]: I0320 14:54:22.125720 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:22 crc kubenswrapper[4764]: I0320 14:54:22.125802 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:22 crc kubenswrapper[4764]: I0320 14:54:22.125834 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:22 crc kubenswrapper[4764]: E0320 14:54:22.125918 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:22 crc kubenswrapper[4764]: I0320 14:54:22.125982 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:22 crc kubenswrapper[4764]: E0320 14:54:22.126098 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:22 crc kubenswrapper[4764]: E0320 14:54:22.126184 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:22 crc kubenswrapper[4764]: E0320 14:54:22.126231 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:24 crc kubenswrapper[4764]: I0320 14:54:24.125524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:24 crc kubenswrapper[4764]: I0320 14:54:24.125633 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:24 crc kubenswrapper[4764]: I0320 14:54:24.125547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:24 crc kubenswrapper[4764]: I0320 14:54:24.125721 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:24 crc kubenswrapper[4764]: E0320 14:54:24.125714 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:24 crc kubenswrapper[4764]: E0320 14:54:24.126126 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:24 crc kubenswrapper[4764]: E0320 14:54:24.126222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:24 crc kubenswrapper[4764]: E0320 14:54:24.126325 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:24 crc kubenswrapper[4764]: E0320 14:54:24.284715 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:25 crc kubenswrapper[4764]: I0320 14:54:25.126520 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:54:25 crc kubenswrapper[4764]: E0320 14:54:25.126779 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p5lds_openshift-ovn-kubernetes(f2a6c163-0457-4626-9bbb-5628a5155673)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" Mar 20 14:54:26 crc kubenswrapper[4764]: I0320 14:54:26.126222 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:26 crc kubenswrapper[4764]: I0320 14:54:26.126273 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:26 crc kubenswrapper[4764]: E0320 14:54:26.126437 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:26 crc kubenswrapper[4764]: I0320 14:54:26.126482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:26 crc kubenswrapper[4764]: I0320 14:54:26.126549 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:26 crc kubenswrapper[4764]: E0320 14:54:26.126681 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:26 crc kubenswrapper[4764]: E0320 14:54:26.126767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:26 crc kubenswrapper[4764]: E0320 14:54:26.126886 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.189833 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/1.log" Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.190560 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/0.log" Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.190636 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f85a77d-475e-43c9-8181-093451bc058f" containerID="7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc" exitCode=1 Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.190734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerDied","Data":"7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc"} Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.190800 4764 scope.go:117] "RemoveContainer" containerID="eb002eb49d6fe7043d66b8a3492b12c49bd1531a1f75ad91c9419e232450f582" Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.191525 4764 scope.go:117] "RemoveContainer" containerID="7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc" Mar 20 14:54:27 crc kubenswrapper[4764]: E0320 14:54:27.191974 
4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d4m5r_openshift-multus(1f85a77d-475e-43c9-8181-093451bc058f)\"" pod="openshift-multus/multus-d4m5r" podUID="1f85a77d-475e-43c9-8181-093451bc058f" Mar 20 14:54:27 crc kubenswrapper[4764]: I0320 14:54:27.219715 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-nt8zs" podStartSLOduration=147.219693453 podStartE2EDuration="2m27.219693453s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:03.110877331 +0000 UTC m=+164.727066500" watchObservedRunningTime="2026-03-20 14:54:27.219693453 +0000 UTC m=+188.835882582" Mar 20 14:54:28 crc kubenswrapper[4764]: I0320 14:54:28.126461 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:28 crc kubenswrapper[4764]: I0320 14:54:28.126609 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:28 crc kubenswrapper[4764]: E0320 14:54:28.126670 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:28 crc kubenswrapper[4764]: I0320 14:54:28.126609 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:28 crc kubenswrapper[4764]: E0320 14:54:28.127412 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:28 crc kubenswrapper[4764]: E0320 14:54:28.127556 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:28 crc kubenswrapper[4764]: I0320 14:54:28.127864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:28 crc kubenswrapper[4764]: E0320 14:54:28.128014 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:28 crc kubenswrapper[4764]: I0320 14:54:28.197766 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/1.log" Mar 20 14:54:29 crc kubenswrapper[4764]: E0320 14:54:29.285359 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:30 crc kubenswrapper[4764]: I0320 14:54:30.125966 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:30 crc kubenswrapper[4764]: E0320 14:54:30.126467 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:30 crc kubenswrapper[4764]: I0320 14:54:30.126037 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:30 crc kubenswrapper[4764]: E0320 14:54:30.126597 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:30 crc kubenswrapper[4764]: I0320 14:54:30.126063 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:30 crc kubenswrapper[4764]: E0320 14:54:30.126733 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:30 crc kubenswrapper[4764]: I0320 14:54:30.126000 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:30 crc kubenswrapper[4764]: E0320 14:54:30.126880 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:32 crc kubenswrapper[4764]: I0320 14:54:32.126004 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:32 crc kubenswrapper[4764]: I0320 14:54:32.126040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:32 crc kubenswrapper[4764]: I0320 14:54:32.126466 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:32 crc kubenswrapper[4764]: I0320 14:54:32.126483 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:32 crc kubenswrapper[4764]: E0320 14:54:32.126618 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:32 crc kubenswrapper[4764]: E0320 14:54:32.126706 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:32 crc kubenswrapper[4764]: E0320 14:54:32.126795 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:32 crc kubenswrapper[4764]: E0320 14:54:32.126859 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:34 crc kubenswrapper[4764]: I0320 14:54:34.126020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:34 crc kubenswrapper[4764]: I0320 14:54:34.126129 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:34 crc kubenswrapper[4764]: I0320 14:54:34.126045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:34 crc kubenswrapper[4764]: E0320 14:54:34.126242 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:34 crc kubenswrapper[4764]: I0320 14:54:34.126129 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:34 crc kubenswrapper[4764]: E0320 14:54:34.126442 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:34 crc kubenswrapper[4764]: E0320 14:54:34.126635 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:34 crc kubenswrapper[4764]: E0320 14:54:34.126904 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:34 crc kubenswrapper[4764]: E0320 14:54:34.287683 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 14:54:36 crc kubenswrapper[4764]: I0320 14:54:36.126104 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:36 crc kubenswrapper[4764]: I0320 14:54:36.126126 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:36 crc kubenswrapper[4764]: I0320 14:54:36.126135 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 14:54:36 crc kubenswrapper[4764]: E0320 14:54:36.127448 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 14:54:36 crc kubenswrapper[4764]: I0320 14:54:36.126191 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:54:36 crc kubenswrapper[4764]: E0320 14:54:36.127596 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 14:54:36 crc kubenswrapper[4764]: E0320 14:54:36.127745 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 14:54:36 crc kubenswrapper[4764]: E0320 14:54:36.127924 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:37 crc kubenswrapper[4764]: I0320 14:54:37.127643 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.098152 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fb2k7"] Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.098282 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7" Mar 20 14:54:38 crc kubenswrapper[4764]: E0320 14:54:38.098370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63" Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.125691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.125824 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:38 crc kubenswrapper[4764]: E0320 14:54:38.125875 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.125902 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:38 crc kubenswrapper[4764]: E0320 14:54:38.126059 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 14:54:38 crc kubenswrapper[4764]: E0320 14:54:38.126176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.238316 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/3.log"
Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.242674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerStarted","Data":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"}
Mar 20 14:54:38 crc kubenswrapper[4764]: I0320 14:54:38.243238 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds"
Mar 20 14:54:39 crc kubenswrapper[4764]: E0320 14:54:39.288361 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 14:54:40 crc kubenswrapper[4764]: I0320 14:54:40.125152 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:40 crc kubenswrapper[4764]: E0320 14:54:40.125337 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63"
Mar 20 14:54:40 crc kubenswrapper[4764]: I0320 14:54:40.125649 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:40 crc kubenswrapper[4764]: I0320 14:54:40.125727 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:40 crc kubenswrapper[4764]: I0320 14:54:40.125967 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:40 crc kubenswrapper[4764]: E0320 14:54:40.125985 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 14:54:40 crc kubenswrapper[4764]: E0320 14:54:40.126074 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 14:54:40 crc kubenswrapper[4764]: E0320 14:54:40.125741 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.125831 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.125844 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.125904 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.125917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:42 crc kubenswrapper[4764]: E0320 14:54:42.126198 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.126478 4764 scope.go:117] "RemoveContainer" containerID="7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc"
Mar 20 14:54:42 crc kubenswrapper[4764]: E0320 14:54:42.127131 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 14:54:42 crc kubenswrapper[4764]: E0320 14:54:42.127221 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63"
Mar 20 14:54:42 crc kubenswrapper[4764]: E0320 14:54:42.127295 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 14:54:42 crc kubenswrapper[4764]: I0320 14:54:42.142192 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podStartSLOduration=161.142171291 podStartE2EDuration="2m41.142171291s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:38.287192829 +0000 UTC m=+199.903381998" watchObservedRunningTime="2026-03-20 14:54:42.142171291 +0000 UTC m=+203.758360440"
Mar 20 14:54:43 crc kubenswrapper[4764]: I0320 14:54:43.263229 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/1.log"
Mar 20 14:54:43 crc kubenswrapper[4764]: I0320 14:54:43.263674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerStarted","Data":"3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136"}
Mar 20 14:54:44 crc kubenswrapper[4764]: I0320 14:54:44.126032 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:44 crc kubenswrapper[4764]: I0320 14:54:44.126433 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:44 crc kubenswrapper[4764]: I0320 14:54:44.126225 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:44 crc kubenswrapper[4764]: I0320 14:54:44.126157 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:44 crc kubenswrapper[4764]: E0320 14:54:44.126835 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fb2k7" podUID="4c881e2f-a84e-4621-9e1e-f2197d698a63"
Mar 20 14:54:44 crc kubenswrapper[4764]: E0320 14:54:44.126986 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 14:54:44 crc kubenswrapper[4764]: E0320 14:54:44.127083 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 14:54:44 crc kubenswrapper[4764]: E0320 14:54:44.127252 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.088862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:46 crc kubenswrapper[4764]: E0320 14:54:46.089077 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 14:54:46 crc kubenswrapper[4764]: E0320 14:54:46.089208 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 14:56:48.08917692 +0000 UTC m=+329.705366079 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.125493 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.126156 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.126694 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.126903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.131770 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.132628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.133007 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.133213 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.133503 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.133681 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.189640 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 14:54:46 crc kubenswrapper[4764]: E0320 14:54:46.189781 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:56:48.189748656 +0000 UTC m=+329.805937825 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.189881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.189935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.190021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.190071 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.198028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.198058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.198413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c881e2f-a84e-4621-9e1e-f2197d698a63-metrics-certs\") pod \"network-metrics-daemon-fb2k7\" (UID: \"4c881e2f-a84e-4621-9e1e-f2197d698a63\") " pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.200471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.449038 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fb2k7"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.466305 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.482232 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:46 crc kubenswrapper[4764]: I0320 14:54:46.717664 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fb2k7"]
Mar 20 14:54:46 crc kubenswrapper[4764]: W0320 14:54:46.728972 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c881e2f_a84e_4621_9e1e_f2197d698a63.slice/crio-a2681d7c6c01b65e609d71d038b57c33f5c2a8ada28c72efb4eee4f8365280a1 WatchSource:0}: Error finding container a2681d7c6c01b65e609d71d038b57c33f5c2a8ada28c72efb4eee4f8365280a1: Status 404 returned error can't find the container with id a2681d7c6c01b65e609d71d038b57c33f5c2a8ada28c72efb4eee4f8365280a1
Mar 20 14:54:46 crc kubenswrapper[4764]: W0320 14:54:46.765140 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-24c38583cbfd3f641fbeccb7777e236c9175ceee9f62edf6e812a742978d7a37 WatchSource:0}: Error finding container 24c38583cbfd3f641fbeccb7777e236c9175ceee9f62edf6e812a742978d7a37: Status 404 returned error can't find the container with id 24c38583cbfd3f641fbeccb7777e236c9175ceee9f62edf6e812a742978d7a37
Mar 20 14:54:46 crc kubenswrapper[4764]: W0320 14:54:46.776462 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-1b2599e08c5c9abc1a164ae4997260722e435a47810a3e912318bc24d1b93907 WatchSource:0}: Error finding container 1b2599e08c5c9abc1a164ae4997260722e435a47810a3e912318bc24d1b93907: Status 404 returned error can't find the container with id 1b2599e08c5c9abc1a164ae4997260722e435a47810a3e912318bc24d1b93907
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.291684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" event={"ID":"4c881e2f-a84e-4621-9e1e-f2197d698a63","Type":"ContainerStarted","Data":"b58eaf9049d645acf57695be2c28bbf5951c89e73ca13434643e3d2f7d3165ee"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.292122 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" event={"ID":"4c881e2f-a84e-4621-9e1e-f2197d698a63","Type":"ContainerStarted","Data":"3f8e99de669c18282b026e1ad96633a93ef850da2d2babacda23df25776a92e0"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.292145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fb2k7" event={"ID":"4c881e2f-a84e-4621-9e1e-f2197d698a63","Type":"ContainerStarted","Data":"a2681d7c6c01b65e609d71d038b57c33f5c2a8ada28c72efb4eee4f8365280a1"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.295906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b9ff40d44337b303789351ab6d41e9b732276de178953579701fb2de103caa32"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.295981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"24c38583cbfd3f641fbeccb7777e236c9175ceee9f62edf6e812a742978d7a37"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.296332 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.303279 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ec0580512f8afba1370544fd1156ee45d85c6183ea17255fd44c0fc74864c722"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.303339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1b2599e08c5c9abc1a164ae4997260722e435a47810a3e912318bc24d1b93907"}
Mar 20 14:54:47 crc kubenswrapper[4764]: I0320 14:54:47.327056 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fb2k7" podStartSLOduration=166.327033876 podStartE2EDuration="2m46.327033876s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:47.31074292 +0000 UTC m=+208.926932119" watchObservedRunningTime="2026-03-20 14:54:47.327033876 +0000 UTC m=+208.943223045"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.167580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.212307 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8gwvk"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.213423 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmp9t"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.213731 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.214499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.214962 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.215667 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.216075 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.216554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.217794 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.218467 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.219445 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.220006 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.220930 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.222129 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.226608 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.227948 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.252810 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-752qt"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.253564 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.253948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.258796 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.259215 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.259545 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.259556 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260148 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260242 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260359 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260419 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.260531 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261035 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-752qt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261354 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261572 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261859 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.261937 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262006 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262138 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262294 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262582 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263118 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.262600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263015 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263466 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263599 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263703 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.263793 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265043 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265309 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265325 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265517 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265794 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265842 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.265950 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.266096 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.266259 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.266503 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5z97v"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.266705 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.267064 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sppmr"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.267402 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvhpw"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.267911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.268084 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.268241 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.268560 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.268611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.269120 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.269292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.269924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.270071 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.270240 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.270926 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.272572 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.276172 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.276447 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.276479 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.276599 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.276911 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.277459 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.277938 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qxxq7"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.278535 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.282433 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.283158 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jpz9j"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.283464 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.283924 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.284682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.284877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.286398 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.290110 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-client\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-images\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdstj\" (UniqueName: \"kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-image-import-ca\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-l62bl\" (UniqueName: \"kubernetes.io/projected/83c08f82-bea9-451a-b1a5-a98b77e1502e-kube-api-access-l62bl\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293914 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-serving-cert\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ca73323-8a03-4812-8cbd-5f22cd297759-machine-approver-tls\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 
crc kubenswrapper[4764]: I0320 14:54:52.293949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptlr\" (UniqueName: \"kubernetes.io/projected/3ca73323-8a03-4812-8cbd-5f22cd297759-kube-api-access-4ptlr\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.293982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-config\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrrw\" (UniqueName: \"kubernetes.io/projected/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-kube-api-access-tvrrw\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294101 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-policies\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" 
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67wr\" (UniqueName: \"kubernetes.io/projected/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-kube-api-access-l67wr\") pod 
\"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-serving-cert\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit-dir\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294269 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc79882-ee25-421d-abfc-7d2684bd348f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294287 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-node-pullsecrets\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-encryption-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26nb8\" (UniqueName: \"kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgrh\" (UniqueName: \"kubernetes.io/projected/7bc79882-ee25-421d-abfc-7d2684bd348f-kube-api-access-jhgrh\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk7j7\" (UniqueName: \"kubernetes.io/projected/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-kube-api-access-vk7j7\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-client\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-dir\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-encryption-config\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.294583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-auth-proxy-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 
20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.308014 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.308599 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.320445 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.321030 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.321547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.321576 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.321859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.322988 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.323564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.328683 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.329145 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.329404 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.329759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.329920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.330054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.331748 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l759h"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.332185 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.336297 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.350236 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.350453 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.369542 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.370098 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.370515 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.370897 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vzfwv"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371354 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371446 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566974-5687l"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371489 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.371567 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.372747 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmp9t"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.372779 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.373105 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-5687l" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.373197 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.373353 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47trd"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.373878 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.374445 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vfjq2"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.375080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.377321 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.377812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8gwvk"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.377830 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.377922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.378400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.379193 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9r7l6"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.380560 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9r7l6" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62bl\" (UniqueName: \"kubernetes.io/projected/83c08f82-bea9-451a-b1a5-a98b77e1502e-kube-api-access-l62bl\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-default-certificate\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.396992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-metrics-certs\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-serving-cert\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397041 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ca73323-8a03-4812-8cbd-5f22cd297759-machine-approver-tls\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397074 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ef043-f571-4aff-90e8-a07752e9086c-serving-cert\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptlr\" (UniqueName: \"kubernetes.io/projected/3ca73323-8a03-4812-8cbd-5f22cd297759-kube-api-access-4ptlr\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d720bf9a-7a1a-422b-b20e-158635d6f293-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397197 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397215 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-trusted-ca\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j6wc\" (UniqueName: \"kubernetes.io/projected/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-kube-api-access-7j6wc\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle\") pod 
\"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-config\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrrw\" (UniqueName: \"kubernetes.io/projected/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-kube-api-access-tvrrw\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjxr\" (UniqueName: \"kubernetes.io/projected/08bd5c50-7656-4a0a-9d9e-9f79eead7527-kube-api-access-5fjxr\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-policies\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprhh\" (UniqueName: \"kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 
crc kubenswrapper[4764]: I0320 14:54:52.397473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvzc\" (UniqueName: \"kubernetes.io/projected/d720bf9a-7a1a-422b-b20e-158635d6f293-kube-api-access-tsvzc\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j685\" (UniqueName: \"kubernetes.io/projected/54649431-46e8-4a08-a142-6a281092660b-kube-api-access-5j685\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d8bf31-ce5e-4f52-9394-1711d8b1f060-config\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: 
\"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-stats-auth\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397643 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1cc20998-b9bf-498c-85eb-037843ae0bc6-metrics-tls\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea5fe851-776a-4cc8-b365-dea09cc3467a-proxy-tls\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-l67wr\" (UniqueName: \"kubernetes.io/projected/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-kube-api-access-l67wr\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rpw2\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-kube-api-access-5rpw2\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc20998-b9bf-498c-85eb-037843ae0bc6-trusted-ca\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d720bf9a-7a1a-422b-b20e-158635d6f293-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667df1e6-264f-40c2-a45f-f50c1cf0b88a-config\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-service-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc79882-ee25-421d-abfc-7d2684bd348f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397852 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-serving-cert\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397867 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit-dir\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-node-pullsecrets\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-encryption-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397938 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/667df1e6-264f-40c2-a45f-f50c1cf0b88a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397956 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26nb8\" (UniqueName: \"kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06b4728-f677-451f-9b6a-23055e2dde6f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.397988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398053 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08bd5c50-7656-4a0a-9d9e-9f79eead7527-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m944h\" (UniqueName: \"kubernetes.io/projected/b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b-kube-api-access-m944h\") pod \"migrator-59844c95c7-pjh2l\" (UID: \"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgrh\" (UniqueName: \"kubernetes.io/projected/7bc79882-ee25-421d-abfc-7d2684bd348f-kube-api-access-jhgrh\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-proxy-tls\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398191 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-metrics-tls\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398208 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7btl\" (UniqueName: \"kubernetes.io/projected/ea5fe851-776a-4cc8-b365-dea09cc3467a-kube-api-access-z7btl\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " 
pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54649431-46e8-4a08-a142-6a281092660b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwgk\" (UniqueName: \"kubernetes.io/projected/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-kube-api-access-7vwgk\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-config\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dc7mb\" (UniqueName: \"kubernetes.io/projected/d06b4728-f677-451f-9b6a-23055e2dde6f-kube-api-access-dc7mb\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398319 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-config\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk7j7\" (UniqueName: \"kubernetes.io/projected/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-kube-api-access-vk7j7\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-client\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt45n\" (UniqueName: \"kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " 
pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54649431-46e8-4a08-a142-6a281092660b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398478 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-encryption-config\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398496 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-dir\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-images\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrws\" (UniqueName: \"kubernetes.io/projected/0c6ef043-f571-4aff-90e8-a07752e9086c-kube-api-access-ktrws\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjln8\" (UniqueName: \"kubernetes.io/projected/fe82f329-50db-4717-aa9b-6245253449cf-kube-api-access-xjln8\") pod \"downloads-7954f5f757-752qt\" (UID: \"fe82f329-50db-4717-aa9b-6245253449cf\") " pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb145565-51bb-4217-b1c0-fec824da2124-service-ca-bundle\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d06b4728-f677-451f-9b6a-23055e2dde6f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-auth-proxy-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4c7xc\" (UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-kube-api-access-4c7xc\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b151a2bb-8540-48a9-85f6-d8d020bd3d89-serving-cert\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h56z\" (UniqueName: \"kubernetes.io/projected/b151a2bb-8540-48a9-85f6-d8d020bd3d89-kube-api-access-2h56z\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: 
\"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-client\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398725 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvp26\" (UniqueName: \"kubernetes.io/projected/fb145565-51bb-4217-b1c0-fec824da2124-kube-api-access-cvp26\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398776 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-images\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdstj\" (UniqueName: \"kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d8bf31-ce5e-4f52-9394-1711d8b1f060-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/667df1e6-264f-40c2-a45f-f50c1cf0b88a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-image-import-ca\") pod \"apiserver-76f77b778f-8gwvk\" 
(UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398859 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d8bf31-ce5e-4f52-9394-1711d8b1f060-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.398876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.400587 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.400858 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.403117 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.404103 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.404533 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.404666 4764 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.404814 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.407900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.408457 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.408539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.408797 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.409510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.409834 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.409942 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-images\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410154 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410168 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410306 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410520 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.410916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.411457 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.411530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.412029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.412064 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.412652 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bc79882-ee25-421d-abfc-7d2684bd348f-config\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc 
kubenswrapper[4764]: I0320 14:54:52.412955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.412998 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.413371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-serving-ca\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.413749 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.413784 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.414295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-752qt"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.416336 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.416588 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-audit-dir\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.416894 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.416464 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/83c08f82-bea9-451a-b1a5-a98b77e1502e-image-import-ca\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.416476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.417491 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.417801 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.418039 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.418363 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.418559 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.418702 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.418847 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.420297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ca73323-8a03-4812-8cbd-5f22cd297759-auth-proxy-config\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.421671 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.421912 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.422131 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.422325 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.422576 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.422834 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.439124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.439578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.447088 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.448331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc79882-ee25-421d-abfc-7d2684bd348f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.448846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/3ca73323-8a03-4812-8cbd-5f22cd297759-machine-approver-tls\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.450179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.450516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-serving-cert\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.450868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-etcd-client\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.451709 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.454707 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.454871 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.451894 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455258 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.452125 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455536 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455641 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.454619 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/83c08f82-bea9-451a-b1a5-a98b77e1502e-node-pullsecrets\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455854 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5z97v"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.455076 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.456514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-dir\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 
14:54:52.457130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.457438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.458482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-audit-policies\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.460979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/83c08f82-bea9-451a-b1a5-a98b77e1502e-encryption-config\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.461061 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.464903 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 
14:54:52.469043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.471182 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.471309 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.471571 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.472022 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.481409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvhpw"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.472106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.472215 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.482863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-encryption-config\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.488919 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.489361 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.489757 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-etcd-client\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.493874 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.494077 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.494096 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.494353 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.494490 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.494405 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.495027 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.495129 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.495248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.495299 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8hb8"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.495449 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-serving-cert\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.496231 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.498516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.498577 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qxxq7"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499258 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l"] Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt45n\" (UniqueName: \"kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 
14:54:52.499916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54649431-46e8-4a08-a142-6a281092660b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-images\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d06b4728-f677-451f-9b6a-23055e2dde6f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrws\" (UniqueName: \"kubernetes.io/projected/0c6ef043-f571-4aff-90e8-a07752e9086c-kube-api-access-ktrws\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.499999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjln8\" (UniqueName: \"kubernetes.io/projected/fe82f329-50db-4717-aa9b-6245253449cf-kube-api-access-xjln8\") 
pod \"downloads-7954f5f757-752qt\" (UID: \"fe82f329-50db-4717-aa9b-6245253449cf\") " pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500016 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb145565-51bb-4217-b1c0-fec824da2124-service-ca-bundle\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c7xc\" (UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-kube-api-access-4c7xc\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b151a2bb-8540-48a9-85f6-d8d020bd3d89-serving-cert\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500067 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h56z\" (UniqueName: \"kubernetes.io/projected/b151a2bb-8540-48a9-85f6-d8d020bd3d89-kube-api-access-2h56z\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvp26\" (UniqueName: \"kubernetes.io/projected/fb145565-51bb-4217-b1c0-fec824da2124-kube-api-access-cvp26\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d8bf31-ce5e-4f52-9394-1711d8b1f060-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/667df1e6-264f-40c2-a45f-f50c1cf0b88a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d8bf31-ce5e-4f52-9394-1711d8b1f060-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-default-certificate\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-metrics-certs\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500338 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ef043-f571-4aff-90e8-a07752e9086c-serving-cert\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d720bf9a-7a1a-422b-b20e-158635d6f293-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-trusted-ca\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500476 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j6wc\" (UniqueName: \"kubernetes.io/projected/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-kube-api-access-7j6wc\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjxr\" (UniqueName: \"kubernetes.io/projected/08bd5c50-7656-4a0a-9d9e-9f79eead7527-kube-api-access-5fjxr\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprhh\" (UniqueName: \"kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvzc\" (UniqueName: \"kubernetes.io/projected/d720bf9a-7a1a-422b-b20e-158635d6f293-kube-api-access-tsvzc\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500626 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d8bf31-ce5e-4f52-9394-1711d8b1f060-config\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500660 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500676 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-stats-auth\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500693 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j685\" (UniqueName: \"kubernetes.io/projected/54649431-46e8-4a08-a142-6a281092660b-kube-api-access-5j685\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500732 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1cc20998-b9bf-498c-85eb-037843ae0bc6-metrics-tls\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea5fe851-776a-4cc8-b365-dea09cc3467a-proxy-tls\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rpw2\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-kube-api-access-5rpw2\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc20998-b9bf-498c-85eb-037843ae0bc6-trusted-ca\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d720bf9a-7a1a-422b-b20e-158635d6f293-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-service-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667df1e6-264f-40c2-a45f-f50c1cf0b88a-config\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/667df1e6-264f-40c2-a45f-f50c1cf0b88a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06b4728-f677-451f-9b6a-23055e2dde6f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500957 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08bd5c50-7656-4a0a-9d9e-9f79eead7527-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m944h\" (UniqueName: \"kubernetes.io/projected/b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b-kube-api-access-m944h\") pod \"migrator-59844c95c7-pjh2l\" (UID: \"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.500990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-proxy-tls\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-metrics-tls\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7btl\" (UniqueName: \"kubernetes.io/projected/ea5fe851-776a-4cc8-b365-dea09cc3467a-kube-api-access-z7btl\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54649431-46e8-4a08-a142-6a281092660b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwgk\" (UniqueName: \"kubernetes.io/projected/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-kube-api-access-7vwgk\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-config\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc7mb\" (UniqueName: \"kubernetes.io/projected/d06b4728-f677-451f-9b6a-23055e2dde6f-kube-api-access-dc7mb\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.501191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-config\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.503014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.503203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.504207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d06b4728-f677-451f-9b6a-23055e2dde6f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.505020 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.505844 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.506128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.506511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.507994 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.508328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.508359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54649431-46e8-4a08-a142-6a281092660b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.508769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54649431-46e8-4a08-a142-6a281092660b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.508968 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.509840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-metrics-tls\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.509852 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.510374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b151a2bb-8540-48a9-85f6-d8d020bd3d89-serving-cert\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.512859 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.513548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.513619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-config\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.513701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.514426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b151a2bb-8540-48a9-85f6-d8d020bd3d89-trusted-ca\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.515364 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vzfwv"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.515748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.517488 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.517520 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.517549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.517848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d06b4728-f677-451f-9b6a-23055e2dde6f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.518055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.519146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.523214 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.525482 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.526994 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.529744 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.531141 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sppmr"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.531466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/667df1e6-264f-40c2-a45f-f50c1cf0b88a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.531631 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.532812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l759h"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.533859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.535356 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.535361 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cs6rb"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.536132 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.536971 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qbtvq"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.537439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qbtvq"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.538625 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-5687l"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.539697 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.540741 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.541845 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.542998 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.543921 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8hb8"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.544435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/667df1e6-264f-40c2-a45f-f50c1cf0b88a-config\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.544902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.545912 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9r7l6"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.546939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47trd"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.548044 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.549060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cs6rb"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.550060 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vfjq2"]
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.556004 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.576257 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.587234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-default-certificate\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.596217 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.611614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-stats-auth\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.616192 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.625980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb145565-51bb-4217-b1c0-fec824da2124-metrics-certs\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.635566 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.637454 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb145565-51bb-4217-b1c0-fec824da2124-service-ca-bundle\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.656006 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.676411 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.697375 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.746070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.747597 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.756099 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.756252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d8bf31-ce5e-4f52-9394-1711d8b1f060-config\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.756939 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45d8bf31-ce5e-4f52-9394-1711d8b1f060-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.775602 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.796772 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.816069 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.837399 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.867994 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.874347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc20998-b9bf-498c-85eb-037843ae0bc6-trusted-ca\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.876761 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.896752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.917601 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.930816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1cc20998-b9bf-498c-85eb-037843ae0bc6-metrics-tls\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.937230 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.944312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.956203 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.961195 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.977234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 14:54:52 crc kubenswrapper[4764]: I0320 14:54:52.996198 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 14:54:53 crc kubenswrapper[4764]:
I0320 14:54:53.017454 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.036127 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.046701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d720bf9a-7a1a-422b-b20e-158635d6f293-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.056630 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.069157 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d720bf9a-7a1a-422b-b20e-158635d6f293-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.077135 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.096156 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.100687 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-proxy-tls\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.116519 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.138137 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.152498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/08bd5c50-7656-4a0a-9d9e-9f79eead7527-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.156597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.176539 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.196184 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.216425 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.236657 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.256688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.277343 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.291594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c6ef043-f571-4aff-90e8-a07752e9086c-serving-cert\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.295866 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.316134 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.322290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-config\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.346498 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.349365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.365413 4764 request.go:700] Waited for 1.015553195s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/configmaps?fieldSelector=metadata.name%3Dservice-ca-bundle&limit=500&resourceVersion=0 Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.366908 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.375501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c6ef043-f571-4aff-90e8-a07752e9086c-service-ca-bundle\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.396453 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.416171 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.436019 4764 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.456916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.467995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea5fe851-776a-4cc8-b365-dea09cc3467a-proxy-tls\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.476885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.493769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ea5fe851-776a-4cc8-b365-dea09cc3467a-images\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.497029 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.516666 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.536336 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.557083 4764 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.576549 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.596314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.617273 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.655009 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.656867 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.676929 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.697114 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.716272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.736467 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.756920 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.776910 4764 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.796088 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.816850 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.836703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.856090 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.877525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.897116 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.916518 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.936195 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.957125 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 14:54:53 crc kubenswrapper[4764]: I0320 14:54:53.976924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 14:54:53 crc 
kubenswrapper[4764]: I0320 14:54:53.996915 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.016690 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.037092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.056361 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.077016 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.096978 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.143743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62bl\" (UniqueName: \"kubernetes.io/projected/83c08f82-bea9-451a-b1a5-a98b77e1502e-kube-api-access-l62bl\") pod \"apiserver-76f77b778f-8gwvk\" (UID: \"83c08f82-bea9-451a-b1a5-a98b77e1502e\") " pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.184279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk7j7\" (UniqueName: \"kubernetes.io/projected/04d1f9f1-ce2a-4c7a-9410-b0b19daa6179-kube-api-access-vk7j7\") pod \"openshift-apiserver-operator-796bbdcf4f-8xtrp\" (UID: \"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:54 crc 
kubenswrapper[4764]: I0320 14:54:54.203688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptlr\" (UniqueName: \"kubernetes.io/projected/3ca73323-8a03-4812-8cbd-5f22cd297759-kube-api-access-4ptlr\") pod \"machine-approver-56656f9798-c8dzl\" (UID: \"3ca73323-8a03-4812-8cbd-5f22cd297759\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.242486 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l67wr\" (UniqueName: \"kubernetes.io/projected/a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6-kube-api-access-l67wr\") pod \"apiserver-7bbb656c7d-x5nqv\" (UID: \"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.262637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvrrw\" (UniqueName: \"kubernetes.io/projected/fe5af817-ef26-4ebf-a14b-bae0470f4fd8-kube-api-access-tvrrw\") pod \"cluster-samples-operator-665b6dd947-4m6cl\" (UID: \"fe5af817-ef26-4ebf-a14b-bae0470f4fd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.282572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdstj\" (UniqueName: \"kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj\") pod \"route-controller-manager-6576b87f9c-scc4x\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.296809 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.303634 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-26nb8\" (UniqueName: \"kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8\") pod \"controller-manager-879f6c89f-v94wn\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.316338 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.335896 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.374935 4764 request.go:700] Waited for 1.87329847s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.377173 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.382789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgrh\" (UniqueName: \"kubernetes.io/projected/7bc79882-ee25-421d-abfc-7d2684bd348f-kube-api-access-jhgrh\") pod \"machine-api-operator-5694c8668f-cmp9t\" (UID: \"7bc79882-ee25-421d-abfc-7d2684bd348f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.386805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m944h\" (UniqueName: \"kubernetes.io/projected/b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b-kube-api-access-m944h\") pod \"migrator-59844c95c7-pjh2l\" (UID: \"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.399844 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.404259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rpw2\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-kube-api-access-5rpw2\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.408996 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.421010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.427318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.432090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.435700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.449490 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" Mar 20 14:54:54 crc kubenswrapper[4764]: W0320 14:54:54.461350 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ca73323_8a03_4812_8cbd_5f22cd297759.slice/crio-c103c35f287076818e30c66dcd2256c627f2b50196d4939fbdb20493180ea477 WatchSource:0}: Error finding container c103c35f287076818e30c66dcd2256c627f2b50196d4939fbdb20493180ea477: Status 404 returned error can't find the container with id c103c35f287076818e30c66dcd2256c627f2b50196d4939fbdb20493180ea477 Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.468123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjxr\" (UniqueName: \"kubernetes.io/projected/08bd5c50-7656-4a0a-9d9e-9f79eead7527-kube-api-access-5fjxr\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmwp9\" (UID: \"08bd5c50-7656-4a0a-9d9e-9f79eead7527\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.470297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprhh\" (UniqueName: \"kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh\") pod \"oauth-openshift-558db77b4-kctmb\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.483206 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.485204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvzc\" (UniqueName: \"kubernetes.io/projected/d720bf9a-7a1a-422b-b20e-158635d6f293-kube-api-access-tsvzc\") pod \"kube-storage-version-migrator-operator-b67b599dd-l4j5r\" (UID: \"d720bf9a-7a1a-422b-b20e-158635d6f293\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.504809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1d1d43b-6e4a-404c-bcb0-48fded5252b7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-jxhgx\" (UID: \"e1d1d43b-6e4a-404c-bcb0-48fded5252b7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.517934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt45n\" (UniqueName: \"kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n\") pod \"console-f9d7485db-5z97v\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") " pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.521582 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.534700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h56z\" (UniqueName: \"kubernetes.io/projected/b151a2bb-8540-48a9-85f6-d8d020bd3d89-kube-api-access-2h56z\") pod \"console-operator-58897d9998-qxxq7\" (UID: \"b151a2bb-8540-48a9-85f6-d8d020bd3d89\") " pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.551083 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j685\" (UniqueName: \"kubernetes.io/projected/54649431-46e8-4a08-a142-6a281092660b-kube-api-access-5j685\") pod \"openshift-controller-manager-operator-756b6f6bc6-6c58z\" (UID: \"54649431-46e8-4a08-a142-6a281092660b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.578511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7btl\" (UniqueName: \"kubernetes.io/projected/ea5fe851-776a-4cc8-b365-dea09cc3467a-kube-api-access-z7btl\") pod \"machine-config-operator-74547568cd-q9j5z\" (UID: \"ea5fe851-776a-4cc8-b365-dea09cc3467a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.591908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrws\" (UniqueName: \"kubernetes.io/projected/0c6ef043-f571-4aff-90e8-a07752e9086c-kube-api-access-ktrws\") pod \"authentication-operator-69f744f599-l759h\" (UID: \"0c6ef043-f571-4aff-90e8-a07752e9086c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.612208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xjln8\" (UniqueName: \"kubernetes.io/projected/fe82f329-50db-4717-aa9b-6245253449cf-kube-api-access-xjln8\") pod \"downloads-7954f5f757-752qt\" (UID: \"fe82f329-50db-4717-aa9b-6245253449cf\") " pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.612540 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.624102 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.630611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.636682 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c7xc\" (UniqueName: \"kubernetes.io/projected/1cc20998-b9bf-498c-85eb-037843ae0bc6-kube-api-access-4c7xc\") pod \"ingress-operator-5b745b69d9-zzvqq\" (UID: \"1cc20998-b9bf-498c-85eb-037843ae0bc6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.645236 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.652738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5d60062-e5c3-4d3c-bae9-7c3272c16a17-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9z78m\" (UID: \"c5d60062-e5c3-4d3c-bae9-7c3272c16a17\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.654264 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-8gwvk"] Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.656818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.663734 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.675000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwgk\" (UniqueName: \"kubernetes.io/projected/1eb8df89-b3e4-4686-b0ed-b4ff1f840c66-kube-api-access-7vwgk\") pod \"dns-operator-744455d44c-mvhpw\" (UID: \"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66\") " pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.690841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j6wc\" (UniqueName: \"kubernetes.io/projected/b1c0c7a8-94b2-434d-9680-31ba9ddcc723-kube-api-access-7j6wc\") pod \"machine-config-controller-84d6567774-tbxfl\" (UID: \"b1c0c7a8-94b2-434d-9680-31ba9ddcc723\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.707432 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.710667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvp26\" (UniqueName: \"kubernetes.io/projected/fb145565-51bb-4217-b1c0-fec824da2124-kube-api-access-cvp26\") pod \"router-default-5444994796-jpz9j\" (UID: \"fb145565-51bb-4217-b1c0-fec824da2124\") " pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.719716 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.722970 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.728261 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"] Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.730945 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.736951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc7mb\" (UniqueName: \"kubernetes.io/projected/d06b4728-f677-451f-9b6a-23055e2dde6f-kube-api-access-dc7mb\") pod \"openshift-config-operator-7777fb866f-hf2cz\" (UID: \"d06b4728-f677-451f-9b6a-23055e2dde6f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.743215 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.746069 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cmp9t"] Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.748040 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.750903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45d8bf31-ce5e-4f52-9394-1711d8b1f060-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fln9z\" (UID: \"45d8bf31-ce5e-4f52-9394-1711d8b1f060\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.760776 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.770786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/667df1e6-264f-40c2-a45f-f50c1cf0b88a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wqjvc\" (UID: \"667df1e6-264f-40c2-a45f-f50c1cf0b88a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.779315 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 14:54:54 crc kubenswrapper[4764]: W0320 14:54:54.784919 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c51fd09_c129_48bf_9bf8_2d455b230386.slice/crio-36c0e4b8e348637947c47c961458e90adbf130ba7ed78ab6916acc4d073ede96 WatchSource:0}: Error finding container 36c0e4b8e348637947c47c961458e90adbf130ba7ed78ab6916acc4d073ede96: Status 404 returned error can't find the container with id 36c0e4b8e348637947c47c961458e90adbf130ba7ed78ab6916acc4d073ede96 Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.797809 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.805611 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"] Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.816884 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.837112 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.858821 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.877625 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.905075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv"] Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.939295 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" Mar 20 14:54:54 crc kubenswrapper[4764]: W0320 14:54:54.945114 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fd45b5_c11a_43c7_a1f2_fb4d4f8e5fd6.slice/crio-1dfd77f191ff47fc2ebbe5ba9bdbc55209003cc7bc385e4f5c7292191f90033e WatchSource:0}: Error finding container 1dfd77f191ff47fc2ebbe5ba9bdbc55209003cc7bc385e4f5c7292191f90033e: Status 404 returned error can't find the container with id 1dfd77f191ff47fc2ebbe5ba9bdbc55209003cc7bc385e4f5c7292191f90033e Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-config\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945645 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-serving-cert\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945676 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkhr\" (UniqueName: \"kubernetes.io/projected/9519ecde-7ffa-4aba-99a9-fc60b895767b-kube-api-access-ctkhr\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-client\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.945982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-988cf\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.946013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-service-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.946056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.946081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.946126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:54 crc kubenswrapper[4764]: E0320 14:54:54.946502 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.446487778 +0000 UTC m=+217.062676907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.970119 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.977275 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.984622 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:54 crc kubenswrapper[4764]: I0320 14:54:54.989321 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.005051 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-752qt"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-csi-data-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 
14:54:55.047577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-srv-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfmt\" (UniqueName: \"kubernetes.io/projected/f8fb19fb-3696-4328-898f-dcef4558a658-kube-api-access-bbfmt\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlzb7\" (UniqueName: \"kubernetes.io/projected/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-kube-api-access-tlzb7\") pod 
\"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkf45\" (UniqueName: \"kubernetes.io/projected/483ebf0b-1701-46df-8a4f-281688694851-kube-api-access-hkf45\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.047753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.048915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.048987 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.548965692 +0000 UTC m=+217.165154821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049166 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69t7v\" (UniqueName: \"kubernetes.io/projected/7000b862-69fa-4708-b74a-2511f65c9569-kube-api-access-69t7v\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-988cf\" (UniqueName: 
\"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2642181-781e-4192-9b05-406b0f97c44a-signing-cabundle\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049649 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dw2\" (UniqueName: \"kubernetes.io/projected/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-kube-api-access-z9dw2\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-tmpfs\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049835 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgmh\" (UniqueName: \"kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-service-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.049982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-node-bootstrap-token\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-serving-cert\") pod \"service-ca-operator-777779d784-47trd\" 
(UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-profile-collector-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.050879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-service-ca\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.051878 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.052114 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.052182 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.552171895 +0000 UTC m=+217.168361024 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.052204 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcw8\" (UniqueName: \"kubernetes.io/projected/b2642181-781e-4192-9b05-406b0f97c44a-kube-api-access-cfcw8\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.052591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl6v9\" (UniqueName: \"kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.052726 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-config\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.053183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-socket-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.053670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.053717 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.053795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-registration-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.053888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03046ce0-bd24-4ad7-968e-8b9652b6bccd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.054400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.055472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmd9z\" (UniqueName: \"kubernetes.io/projected/ac15f1c4-c565-46d8-af08-ba5f7736fe04-kube-api-access-zmd9z\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.055522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-srv-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.055773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxcd\" (UniqueName: \"kubernetes.io/projected/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-kube-api-access-cjxcd\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.055880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8fb19fb-3696-4328-898f-dcef4558a658-config-volume\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.055925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-serving-cert\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9519ecde-7ffa-4aba-99a9-fc60b895767b-config\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkhr\" (UniqueName: \"kubernetes.io/projected/9519ecde-7ffa-4aba-99a9-fc60b895767b-kube-api-access-ctkhr\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-config\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrj97\" (UniqueName: \"kubernetes.io/projected/03046ce0-bd24-4ad7-968e-8b9652b6bccd-kube-api-access-jrj97\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056415 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-plugins-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wr87\" (UniqueName: \"kubernetes.io/projected/49509900-efe2-4bbb-b89e-5666cec1caf2-kube-api-access-4wr87\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-client\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.056717 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjzlq\" (UniqueName: \"kubernetes.io/projected/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-kube-api-access-fjzlq\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8fb19fb-3696-4328-898f-dcef4558a658-metrics-tls\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-mountpoint-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-cert\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-certs\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057398 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-486kc\" (UniqueName: \"kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc\") pod \"auto-csr-approver-29566974-5687l\" (UID: \"1f3afda6-923c-403a-994d-996da0ad0fee\") " pod="openshift-infra/auto-csr-approver-29566974-5687l"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057442 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-webhook-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057460 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2642181-781e-4192-9b05-406b0f97c44a-signing-key\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.057512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-apiservice-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.060187 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl"]
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.060979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-etcd-client\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.061426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.062118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.063781 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l"]
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.065715 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp"]
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.068931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9519ecde-7ffa-4aba-99a9-fc60b895767b-serving-cert\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.088107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.119333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-988cf\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.159732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.159859 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.659811682 +0000 UTC m=+217.276000811 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-config\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrj97\" (UniqueName: \"kubernetes.io/projected/03046ce0-bd24-4ad7-968e-8b9652b6bccd-kube-api-access-jrj97\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wr87\" (UniqueName: \"kubernetes.io/projected/49509900-efe2-4bbb-b89e-5666cec1caf2-kube-api-access-4wr87\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-plugins-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjzlq\" (UniqueName: \"kubernetes.io/projected/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-kube-api-access-fjzlq\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8fb19fb-3696-4328-898f-dcef4558a658-metrics-tls\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-mountpoint-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-cert\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-certs\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-486kc\" (UniqueName: \"kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc\") pod \"auto-csr-approver-29566974-5687l\" (UID: \"1f3afda6-923c-403a-994d-996da0ad0fee\") " pod="openshift-infra/auto-csr-approver-29566974-5687l"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-webhook-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2642181-781e-4192-9b05-406b0f97c44a-signing-key\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-apiservice-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-csi-data-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfmt\" (UniqueName: \"kubernetes.io/projected/f8fb19fb-3696-4328-898f-dcef4558a658-kube-api-access-bbfmt\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-srv-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlzb7\" (UniqueName: \"kubernetes.io/projected/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-kube-api-access-tlzb7\") pod \"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkf45\" (UniqueName: \"kubernetes.io/projected/483ebf0b-1701-46df-8a4f-281688694851-kube-api-access-hkf45\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69t7v\" (UniqueName: \"kubernetes.io/projected/7000b862-69fa-4708-b74a-2511f65c9569-kube-api-access-69t7v\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160671 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2642181-781e-4192-9b05-406b0f97c44a-signing-cabundle\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgmh\" (UniqueName: \"kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dw2\" (UniqueName: \"kubernetes.io/projected/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-kube-api-access-z9dw2\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160717 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-tmpfs\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160731 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160747 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-node-bootstrap-token\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-serving-cert\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-profile-collector-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160821 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcw8\" (UniqueName: \"kubernetes.io/projected/b2642181-781e-4192-9b05-406b0f97c44a-kube-api-access-cfcw8\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl6v9\" (UniqueName: \"kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-csi-data-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160883 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-socket-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-mountpoint-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.160920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-registration-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.162115 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-config\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.165411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-plugins-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03046ce0-bd24-4ad7-968e-8b9652b6bccd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmd9z\" (UniqueName: \"kubernetes.io/projected/ac15f1c4-c565-46d8-af08-ba5f7736fe04-kube-api-access-zmd9z\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-srv-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxcd\" (UniqueName: \"kubernetes.io/projected/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-kube-api-access-cjxcd\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.166667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8fb19fb-3696-4328-898f-dcef4558a658-config-volume\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb"
Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.167393
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkhr\" (UniqueName: \"kubernetes.io/projected/9519ecde-7ffa-4aba-99a9-fc60b895767b-kube-api-access-ctkhr\") pod \"etcd-operator-b45778765-sppmr\" (UID: \"9519ecde-7ffa-4aba-99a9-fc60b895767b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.168239 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b2642181-781e-4192-9b05-406b0f97c44a-signing-cabundle\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.168460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-tmpfs\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.168793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-socket-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.169181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49509900-efe2-4bbb-b89e-5666cec1caf2-registration-dir\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.169210 4764 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.669197184 +0000 UTC m=+217.285386313 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.170788 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-cert\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.172498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.172587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8fb19fb-3696-4328-898f-dcef4558a658-config-volume\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.174830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8fb19fb-3696-4328-898f-dcef4558a658-metrics-tls\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.177931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-node-bootstrap-token\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.181534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-srv-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.183173 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b2642181-781e-4192-9b05-406b0f97c44a-signing-key\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.185158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.185893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.185932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-webhook-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.186059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.186316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/483ebf0b-1701-46df-8a4f-281688694851-certs\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.186559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-apiservice-cert\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.187256 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.188298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac15f1c4-c565-46d8-af08-ba5f7736fe04-srv-cert\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.190222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-serving-cert\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.190713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03046ce0-bd24-4ad7-968e-8b9652b6bccd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.192965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7000b862-69fa-4708-b74a-2511f65c9569-profile-collector-cert\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.216211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dw2\" (UniqueName: \"kubernetes.io/projected/0cd6640b-2f6c-4900-b850-bda7f5c9ae6c-kube-api-access-z9dw2\") pod \"service-ca-operator-777779d784-47trd\" (UID: \"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.251728 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.251854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wr87\" (UniqueName: \"kubernetes.io/projected/49509900-efe2-4bbb-b89e-5666cec1caf2-kube-api-access-4wr87\") pod \"csi-hostpathplugin-x8hb8\" (UID: \"49509900-efe2-4bbb-b89e-5666cec1caf2\") " pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.254285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qxxq7"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.257979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrj97\" (UniqueName: \"kubernetes.io/projected/03046ce0-bd24-4ad7-968e-8b9652b6bccd-kube-api-access-jrj97\") pod \"package-server-manager-789f6589d5-cnd46\" (UID: \"03046ce0-bd24-4ad7-968e-8b9652b6bccd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.268020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.268362 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.76834309 +0000 UTC m=+217.384532219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.270714 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.272973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5z97v"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.276607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkf45\" (UniqueName: \"kubernetes.io/projected/483ebf0b-1701-46df-8a4f-281688694851-kube-api-access-hkf45\") pod \"machine-config-server-qbtvq\" (UID: \"483ebf0b-1701-46df-8a4f-281688694851\") " pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.292596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.298911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfmt\" (UniqueName: \"kubernetes.io/projected/f8fb19fb-3696-4328-898f-dcef4558a658-kube-api-access-bbfmt\") pod \"dns-default-cs6rb\" (UID: \"f8fb19fb-3696-4328-898f-dcef4558a658\") " pod="openshift-dns/dns-default-cs6rb" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.301351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z"] Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.314216 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1d1d43b_6e4a_404c_bcb0_48fded5252b7.slice/crio-fbd95bd217468754a233952c1e8cbace619c6bf05ef8f4c3b1b165229113c871 WatchSource:0}: Error finding container fbd95bd217468754a233952c1e8cbace619c6bf05ef8f4c3b1b165229113c871: Status 404 returned error can't find the container with id fbd95bd217468754a233952c1e8cbace619c6bf05ef8f4c3b1b165229113c871 Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.328276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlzb7\" (UniqueName: \"kubernetes.io/projected/0e7aac7a-18b6-4f47-8f67-05df35c07fd2-kube-api-access-tlzb7\") pod \"multus-admission-controller-857f4d67dd-vfjq2\" (UID: \"0e7aac7a-18b6-4f47-8f67-05df35c07fd2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.337127 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69t7v\" (UniqueName: \"kubernetes.io/projected/7000b862-69fa-4708-b74a-2511f65c9569-kube-api-access-69t7v\") pod \"catalog-operator-68c6474976-4m2jn\" (UID: \"7000b862-69fa-4708-b74a-2511f65c9569\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.349881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjzlq\" (UniqueName: \"kubernetes.io/projected/ba8337ed-07c7-4cb3-91bd-899ce9da7a29-kube-api-access-fjzlq\") pod \"ingress-canary-9r7l6\" (UID: \"ba8337ed-07c7-4cb3-91bd-899ce9da7a29\") " pod="openshift-ingress-canary/ingress-canary-9r7l6" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.369304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.369914 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.869897762 +0000 UTC m=+217.486086891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.373869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" event={"ID":"e1d1d43b-6e4a-404c-bcb0-48fded5252b7","Type":"ContainerStarted","Data":"fbd95bd217468754a233952c1e8cbace619c6bf05ef8f4c3b1b165229113c871"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.375306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgmh\" (UniqueName: \"kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh\") pod \"collect-profiles-29566965-5zjk4\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.378959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" event={"ID":"3ca73323-8a03-4812-8cbd-5f22cd297759","Type":"ContainerStarted","Data":"0d63d7b276ac0a41d164ced32cdbbf14edbdbbaad91dc4ef14560603a7da42de"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.379019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" event={"ID":"3ca73323-8a03-4812-8cbd-5f22cd297759","Type":"ContainerStarted","Data":"c103c35f287076818e30c66dcd2256c627f2b50196d4939fbdb20493180ea477"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.391516 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-486kc\" (UniqueName: \"kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc\") pod \"auto-csr-approver-29566974-5687l\" (UID: \"1f3afda6-923c-403a-994d-996da0ad0fee\") " pod="openshift-infra/auto-csr-approver-29566974-5687l" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.437676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl6v9\" (UniqueName: \"kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9\") pod \"marketplace-operator-79b997595-hkbns\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.440372 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b7cf43_9ff0_4ab9_a884_955f2bc6e870.slice/crio-0246170a27cd6a9152ac3f371af28627e89e47cda007cfb6db92b9a0eeeecdbb WatchSource:0}: Error finding container 0246170a27cd6a9152ac3f371af28627e89e47cda007cfb6db92b9a0eeeecdbb: Status 404 returned error can't find the container with id 0246170a27cd6a9152ac3f371af28627e89e47cda007cfb6db92b9a0eeeecdbb Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.444451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcw8\" (UniqueName: \"kubernetes.io/projected/b2642181-781e-4192-9b05-406b0f97c44a-kube-api-access-cfcw8\") pod \"service-ca-9c57cc56f-vzfwv\" (UID: \"b2642181-781e-4192-9b05-406b0f97c44a\") " pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.452835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" 
event={"ID":"7bc79882-ee25-421d-abfc-7d2684bd348f","Type":"ContainerStarted","Data":"7beed32a6f9d26f8a99d17b954fe3b12f486af40e13a7b9024df434962f49859"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.452872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" event={"ID":"7bc79882-ee25-421d-abfc-7d2684bd348f","Type":"ContainerStarted","Data":"d3bb951608b9125f5ef730a2629b89475871136d10fb22dd34c6e9b0e53bd04e"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.453056 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-5687l" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.453958 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.459070 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.461771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.461777 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.463065 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd720bf9a_7a1a_422b_b20e_158635d6f293.slice/crio-2a604cecfcd1b97aba67ffcd631e90dae73a121d2541651f3e949c5ed75aa236 WatchSource:0}: Error finding container 2a604cecfcd1b97aba67ffcd631e90dae73a121d2541651f3e949c5ed75aa236: Status 404 returned error can't find the container with id 2a604cecfcd1b97aba67ffcd631e90dae73a121d2541651f3e949c5ed75aa236 Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.465734 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmd9z\" (UniqueName: \"kubernetes.io/projected/ac15f1c4-c565-46d8-af08-ba5f7736fe04-kube-api-access-zmd9z\") pod \"olm-operator-6b444d44fb-vrpsq\" (UID: \"ac15f1c4-c565-46d8-af08-ba5f7736fe04\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.469998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5z97v" event={"ID":"82463101-a3d9-4a1b-a180-aba0318fbeb4","Type":"ContainerStarted","Data":"afe04cae588f16c1ecda6a0e7c85e24fc865724d668c296e048a95a78e42c2e8"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.470213 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.470326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.470631 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:55.970615214 +0000 UTC m=+217.586804343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.473070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxcd\" (UniqueName: \"kubernetes.io/projected/1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9-kube-api-access-cjxcd\") pod \"packageserver-d55dfcdfc-slnm5\" (UID: \"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.479505 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.480487 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" event={"ID":"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca","Type":"ContainerStarted","Data":"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.480515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" event={"ID":"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca","Type":"ContainerStarted","Data":"6151592aef9500d9865b21c23e99c3880253fd3bd765a4de279dfa47b2816edb"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.481419 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.485277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" event={"ID":"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b","Type":"ContainerStarted","Data":"6b4fe101e62367222818e8ce5d2d2fbec5d7f17853b0e0dab237ea6f2104f954"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.487936 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.492088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jpz9j" event={"ID":"fb145565-51bb-4217-b1c0-fec824da2124","Type":"ContainerStarted","Data":"f6e795b7c1b88ce6fa9a74eb363584b2824fb074325886085339f4582093dd56"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.493629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" event={"ID":"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179","Type":"ContainerStarted","Data":"44adbfb56d387e8c86d140a8977390b4bdf38d9afd89c4ea7ef2483b7bb96fac"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.495272 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v94wn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.495321 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.495444 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" event={"ID":"3c51fd09-c129-48bf-9bf8-2d455b230386","Type":"ContainerStarted","Data":"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.495461 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" event={"ID":"3c51fd09-c129-48bf-9bf8-2d455b230386","Type":"ContainerStarted","Data":"36c0e4b8e348637947c47c961458e90adbf130ba7ed78ab6916acc4d073ede96"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.495839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.500145 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.501270 4764 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-scc4x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.501316 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.504976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.505818 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9r7l6" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.518577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-752qt" event={"ID":"fe82f329-50db-4717-aa9b-6245253449cf","Type":"ContainerStarted","Data":"90408375d197cf30fd8c0754013d184631eb05cc7a83a4c07bc6ff3913df5471"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.520334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" event={"ID":"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6","Type":"ContainerStarted","Data":"1dfd77f191ff47fc2ebbe5ba9bdbc55209003cc7bc385e4f5c7292191f90033e"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.526685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" event={"ID":"83c08f82-bea9-451a-b1a5-a98b77e1502e","Type":"ContainerStarted","Data":"1bd16ae4c446cee3f28cf08388138a409f8fbf5ff0f2aacc144b9fad87b61573"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.526712 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" event={"ID":"83c08f82-bea9-451a-b1a5-a98b77e1502e","Type":"ContainerStarted","Data":"dab8cc7a00cc3abe0d371d605aa2d0446f04107e0f2a2daabd5b103bf6c9edb4"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.527644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" event={"ID":"b151a2bb-8540-48a9-85f6-d8d020bd3d89","Type":"ContainerStarted","Data":"df510583c5b74a4170d177eb90318631119eac1f828195d0fbd76b35077996e7"} Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.540658 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.546458 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06b4728_f677_451f_9b6a_23055e2dde6f.slice/crio-305fdea57ed78224085b384a2e7316fa8f64836eedefae482272924c8458e5f5 WatchSource:0}: Error finding container 305fdea57ed78224085b384a2e7316fa8f64836eedefae482272924c8458e5f5: Status 404 returned error can't find the container with id 305fdea57ed78224085b384a2e7316fa8f64836eedefae482272924c8458e5f5 Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.551950 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.553275 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cs6rb" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.568496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-l759h"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.573167 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.574834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 14:54:56.074815558 +0000 UTC m=+217.691004687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.575018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qbtvq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.583771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.607900 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mvhpw"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.614844 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.654005 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.654514 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-sppmr"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.670215 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.673937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.674140 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.174124021 +0000 UTC m=+217.790313150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.674187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.674915 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.174901508 +0000 UTC m=+217.791090637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.675866 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.681711 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.701339 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6ef043_f571_4aff_90e8_a07752e9086c.slice/crio-d20945dc018dce2e8543a322d6a301da4fbbc2ce264c14a57815685dbec79d35 WatchSource:0}: Error finding container d20945dc018dce2e8543a322d6a301da4fbbc2ce264c14a57815685dbec79d35: Status 404 returned error can't find the container with id d20945dc018dce2e8543a322d6a301da4fbbc2ce264c14a57815685dbec79d35 Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.708496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.708787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.775053 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.775240 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.275219446 +0000 UTC m=+217.891408575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.775431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.775766 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.275753505 +0000 UTC m=+217.891942634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.788657 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eb8df89_b3e4_4686_b0ed_b4ff1f840c66.slice/crio-b8ea53593f6462838fb4191ed3c3415dd805e93889a3f37071d6e2296163d706 WatchSource:0}: Error finding container b8ea53593f6462838fb4191ed3c3415dd805e93889a3f37071d6e2296163d706: Status 404 returned error can't find the container with id b8ea53593f6462838fb4191ed3c3415dd805e93889a3f37071d6e2296163d706 Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.806809 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9519ecde_7ffa_4aba_99a9_fc60b895767b.slice/crio-9f771bbb26e14c90b5f94c345a4265aadd4bd1b6cce07a0a0b50eee9e113ca63 WatchSource:0}: Error finding container 9f771bbb26e14c90b5f94c345a4265aadd4bd1b6cce07a0a0b50eee9e113ca63: Status 404 returned error can't find the container with id 9f771bbb26e14c90b5f94c345a4265aadd4bd1b6cce07a0a0b50eee9e113ca63 Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.814677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.881842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.882021 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.381988672 +0000 UTC m=+217.998177801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.882946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.883658 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.3836274 +0000 UTC m=+217.999816529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:55 crc kubenswrapper[4764]: W0320 14:54:55.926878 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667df1e6_264f_40c2_a45f_f50c1cf0b88a.slice/crio-66565e88ac44967a5a53219ca1fa553709e555a914873284a166a5d69a4e246d WatchSource:0}: Error finding container 66565e88ac44967a5a53219ca1fa553709e555a914873284a166a5d69a4e246d: Status 404 returned error can't find the container with id 66565e88ac44967a5a53219ca1fa553709e555a914873284a166a5d69a4e246d Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.974487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn"] Mar 20 14:54:55 crc kubenswrapper[4764]: I0320 14:54:55.983900 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:55 crc kubenswrapper[4764]: E0320 14:54:55.984346 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.484329631 +0000 UTC m=+218.100518760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: W0320 14:54:56.063416 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7000b862_69fa_4708_b74a_2511f65c9569.slice/crio-bbea17a904045b0479771ec77edd0db9992a10b51fb0f20c0da4b68e08e3291e WatchSource:0}: Error finding container bbea17a904045b0479771ec77edd0db9992a10b51fb0f20c0da4b68e08e3291e: Status 404 returned error can't find the container with id bbea17a904045b0479771ec77edd0db9992a10b51fb0f20c0da4b68e08e3291e Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.085083 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.085396 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.585372665 +0000 UTC m=+218.201561794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.135428 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-5687l"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.138243 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vfjq2"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.185797 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.185992 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.685971442 +0000 UTC m=+218.302160571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.186038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.186391 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.686366016 +0000 UTC m=+218.302555145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: W0320 14:54:56.212497 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3afda6_923c_403a_994d_996da0ad0fee.slice/crio-18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004 WatchSource:0}: Error finding container 18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004: Status 404 returned error can't find the container with id 18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004 Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.231327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-47trd"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.251445 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.293716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.294127 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.794111146 +0000 UTC m=+218.410300275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.322127 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.329497 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x8hb8"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.378192 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9r7l6"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.399208 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.400542 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:56.899861707 +0000 UTC m=+218.516050836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.503564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.503828 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.003804332 +0000 UTC m=+218.619993461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.504056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.504562 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.004546178 +0000 UTC m=+218.620735307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.606282 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.606919 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.106903289 +0000 UTC m=+218.723092418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.624460 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.641970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" event={"ID":"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b","Type":"ContainerStarted","Data":"71fbb343a445370cec46069c8af604a9160d0964022cae05ff266b4b8ae1cebb"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.657566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" event={"ID":"1cc20998-b9bf-498c-85eb-037843ae0bc6","Type":"ContainerStarted","Data":"683ed28bc84582af9c3408bd21e0078c4dbee32e26fac1a6f13ec967add77c5a"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.657699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" event={"ID":"1cc20998-b9bf-498c-85eb-037843ae0bc6","Type":"ContainerStarted","Data":"85b435392d411da632d57b37763e3d09399e217a3ca892ef81b936c696c7ab62"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.662003 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.662969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" event={"ID":"04d1f9f1-ce2a-4c7a-9410-b0b19daa6179","Type":"ContainerStarted","Data":"a9e34ad3deb73d68a46dfe07b5d10cc23220f2b92f1686376a8b325a5b7050a8"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.674089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" event={"ID":"1a20779e-1d3a-4c81-86c7-3248b50c8118","Type":"ContainerStarted","Data":"655641a7fb6341df89e5739b1fde5b70996487251a706f9b8c34449a7c833833"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.680567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" event={"ID":"b1c0c7a8-94b2-434d-9680-31ba9ddcc723","Type":"ContainerStarted","Data":"016f9a9b5a0157b28dcbe0c97befd2dd646f9fddeaeb1c0b36948292ab32bc30"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.680649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" event={"ID":"b1c0c7a8-94b2-434d-9680-31ba9ddcc723","Type":"ContainerStarted","Data":"93ced5e4c3bc0055b9e45d7959cde9a66ec2524d1e53e824860c9befdc4d86a0"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.690600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9r7l6" event={"ID":"ba8337ed-07c7-4cb3-91bd-899ce9da7a29","Type":"ContainerStarted","Data":"bb4f4a6c2ca31ccc6c30b1dd13e1c98e67f36186b68951d0c17a8ff1a745a756"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.705015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" event={"ID":"45d8bf31-ce5e-4f52-9394-1711d8b1f060","Type":"ContainerStarted","Data":"d26e4c41118a166cae4a6fa17402d8214ee242340fe01029dfde64a76683e858"} Mar 20 14:54:56 crc 
kubenswrapper[4764]: I0320 14:54:56.714275 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.716512 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.216494305 +0000 UTC m=+218.832683434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.720695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" event={"ID":"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66","Type":"ContainerStarted","Data":"b8ea53593f6462838fb4191ed3c3415dd805e93889a3f37071d6e2296163d706"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.742678 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.762311 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cs6rb"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.776285 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" event={"ID":"3ca73323-8a03-4812-8cbd-5f22cd297759","Type":"ContainerStarted","Data":"ac3fce268387ae41847ece16bc268a3a0658f64bab4b1cec2812d049b6067258"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.815078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.816699 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.316675677 +0000 UTC m=+218.932864806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.846033 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6" containerID="f1f5c89b0df67587de624f3d123c1f216c84bf1caf69c5e2c4c9ef55421929a8" exitCode=0 Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.846239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" event={"ID":"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6","Type":"ContainerDied","Data":"f1f5c89b0df67587de624f3d123c1f216c84bf1caf69c5e2c4c9ef55421929a8"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.849919 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vzfwv"] Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.884510 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c8dzl" podStartSLOduration=176.884484376 podStartE2EDuration="2m56.884484376s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:56.87556662 +0000 UTC m=+218.491755749" watchObservedRunningTime="2026-03-20 14:54:56.884484376 +0000 UTC m=+218.500673505" Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.895396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" 
event={"ID":"b151a2bb-8540-48a9-85f6-d8d020bd3d89","Type":"ContainerStarted","Data":"d841cc552539b86fb1056462ff75d89038c64198e94d9a39cc5553f75cfe55a3"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.895678 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.903071 4764 patch_prober.go:28] interesting pod/console-operator-58897d9998-qxxq7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.903171 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" podUID="b151a2bb-8540-48a9-85f6-d8d020bd3d89" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.912778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" event={"ID":"d06b4728-f677-451f-9b6a-23055e2dde6f","Type":"ContainerStarted","Data":"be1162eb23568853a827c458183deb089fe65349eedc6251e87a859eaadd8d2a"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.912819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" event={"ID":"d06b4728-f677-451f-9b6a-23055e2dde6f","Type":"ContainerStarted","Data":"305fdea57ed78224085b384a2e7316fa8f64836eedefae482272924c8458e5f5"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.916403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:56 crc kubenswrapper[4764]: E0320 14:54:56.918089 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.418076364 +0000 UTC m=+219.034265483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.924940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" event={"ID":"9519ecde-7ffa-4aba-99a9-fc60b895767b","Type":"ContainerStarted","Data":"9f771bbb26e14c90b5f94c345a4265aadd4bd1b6cce07a0a0b50eee9e113ca63"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.934589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" event={"ID":"7000b862-69fa-4708-b74a-2511f65c9569","Type":"ContainerStarted","Data":"bbea17a904045b0479771ec77edd0db9992a10b51fb0f20c0da4b68e08e3291e"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.944505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" 
event={"ID":"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870","Type":"ContainerStarted","Data":"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.944568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" event={"ID":"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870","Type":"ContainerStarted","Data":"0246170a27cd6a9152ac3f371af28627e89e47cda007cfb6db92b9a0eeeecdbb"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.945270 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.951760 4764 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kctmb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.951844 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.953091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" event={"ID":"d720bf9a-7a1a-422b-b20e-158635d6f293","Type":"ContainerStarted","Data":"dc3fa2fb3de00e634042518c1c0b047ee1a8dc3b110aaf25cb9b090b6e24505e"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.953127 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" event={"ID":"d720bf9a-7a1a-422b-b20e-158635d6f293","Type":"ContainerStarted","Data":"2a604cecfcd1b97aba67ffcd631e90dae73a121d2541651f3e949c5ed75aa236"} Mar 20 14:54:56 crc kubenswrapper[4764]: I0320 14:54:56.978302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-5687l" event={"ID":"1f3afda6-923c-403a-994d-996da0ad0fee","Type":"ContainerStarted","Data":"18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.017742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.018763 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.518748624 +0000 UTC m=+219.134937753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.038318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jpz9j" event={"ID":"fb145565-51bb-4217-b1c0-fec824da2124","Type":"ContainerStarted","Data":"14b7a9e5a06165292e0cbe03cd266147b688b6093a92acfc6217a5e7c8d447fc"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.043179 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" podStartSLOduration=176.043166177 podStartE2EDuration="2m56.043166177s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.013995255 +0000 UTC m=+218.630184384" watchObservedRunningTime="2026-03-20 14:54:57.043166177 +0000 UTC m=+218.659355306" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.068540 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8xtrp" podStartSLOduration=177.068520914 podStartE2EDuration="2m57.068520914s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.046905259 +0000 UTC m=+218.663094378" watchObservedRunningTime="2026-03-20 14:54:57.068520914 +0000 UTC m=+218.684710043" Mar 20 14:54:57 
crc kubenswrapper[4764]: I0320 14:54:57.069933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" event={"ID":"08bd5c50-7656-4a0a-9d9e-9f79eead7527","Type":"ContainerStarted","Data":"fc533e1e18e22dace76260e88291b8f954ae90932ede335813c0bb9100f10b91"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.069957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" event={"ID":"08bd5c50-7656-4a0a-9d9e-9f79eead7527","Type":"ContainerStarted","Data":"c3fcfe058142cacd263187d069dd817e8c8ffd5b8ac3f14f0cc1c18f89393f70"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.089304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" event={"ID":"03046ce0-bd24-4ad7-968e-8b9652b6bccd","Type":"ContainerStarted","Data":"640645f8030b1370b3ff864042544175a04368957f649a8349fcd6ccba1773b4"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.119115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.119596 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.61957307 +0000 UTC m=+219.235762199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.136776 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" podStartSLOduration=176.136757207 podStartE2EDuration="2m56.136757207s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.109914158 +0000 UTC m=+218.726103287" watchObservedRunningTime="2026-03-20 14:54:57.136757207 +0000 UTC m=+218.752946336" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.157472 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-l4j5r" podStartSLOduration=176.157454119 podStartE2EDuration="2m56.157454119s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.155943856 +0000 UTC m=+218.772132985" watchObservedRunningTime="2026-03-20 14:54:57.157454119 +0000 UTC m=+218.773643248" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.172320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" 
event={"ID":"7bc79882-ee25-421d-abfc-7d2684bd348f","Type":"ContainerStarted","Data":"1f56442f19c5fbedb3754c81f241501c6205d822917e038d57d4a06240b2ce86"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.172473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" event={"ID":"ea5fe851-776a-4cc8-b365-dea09cc3467a","Type":"ContainerStarted","Data":"2cf48564afab68ea4ba5e79767e4eec55e33762c5b260c8704923e404d374b3c"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.192733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" event={"ID":"54649431-46e8-4a08-a142-6a281092660b","Type":"ContainerStarted","Data":"b04f0989a13bd963f16b017cb29de3225990d3bcc028e7b776f504ad86cef9c6"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.192763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" event={"ID":"54649431-46e8-4a08-a142-6a281092660b","Type":"ContainerStarted","Data":"c71678b03c3400b2cbdecc72b8741f7522235858255dfe60dc8e63b03ef4cf7b"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.226976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.227325 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 14:54:57.727301049 +0000 UTC m=+219.343490188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.229556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.232092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" event={"ID":"0e7aac7a-18b6-4f47-8f67-05df35c07fd2","Type":"ContainerStarted","Data":"e12f9beab56a624099d15f7468bf016154abcd495becb5b9ae3b89d60f49bad3"} Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.232448 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.73243635 +0000 UTC m=+219.348625479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.284237 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" podStartSLOduration=177.284220262 podStartE2EDuration="2m57.284220262s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.281502346 +0000 UTC m=+218.897691485" watchObservedRunningTime="2026-03-20 14:54:57.284220262 +0000 UTC m=+218.900409391" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.290928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qbtvq" event={"ID":"483ebf0b-1701-46df-8a4f-281688694851","Type":"ContainerStarted","Data":"6c3c341cc1d905fafeadee785b8494eb25669f757ec6869a4dd2945bd2b86d03"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.330514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.330629 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.830608592 +0000 UTC m=+219.446797721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.330815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.332748 4764 generic.go:334] "Generic (PLEG): container finished" podID="83c08f82-bea9-451a-b1a5-a98b77e1502e" containerID="1bd16ae4c446cee3f28cf08388138a409f8fbf5ff0f2aacc144b9fad87b61573" exitCode=0 Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.332792 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.832775859 +0000 UTC m=+219.448964988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.332838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" event={"ID":"83c08f82-bea9-451a-b1a5-a98b77e1502e","Type":"ContainerDied","Data":"1bd16ae4c446cee3f28cf08388138a409f8fbf5ff0f2aacc144b9fad87b61573"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.367537 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" podStartSLOduration=177.367519718 podStartE2EDuration="2m57.367519718s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.361825557 +0000 UTC m=+218.978014676" watchObservedRunningTime="2026-03-20 14:54:57.367519718 +0000 UTC m=+218.983708847" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.372266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-752qt" event={"ID":"fe82f329-50db-4717-aa9b-6245253449cf","Type":"ContainerStarted","Data":"86ae1ffd4b7b8077f87b668a8300f292e502dca00d76e9687c691b6a92e3c4d0"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.374811 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.375688 4764 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-752qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.375736 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-752qt" podUID="fe82f329-50db-4717-aa9b-6245253449cf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.385260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" event={"ID":"e1d1d43b-6e4a-404c-bcb0-48fded5252b7","Type":"ContainerStarted","Data":"3c61a7fd17b39867aaa1e05bc56d578fc4d1f899e0266a1878e885634586c1bd"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.419400 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" event={"ID":"fe5af817-ef26-4ebf-a14b-bae0470f4fd8","Type":"ContainerStarted","Data":"a16da194980d52567de3666cc2a22dc01e33f8e67eb9b2d7fe87bbeac5e2f7bb"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.419438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" event={"ID":"fe5af817-ef26-4ebf-a14b-bae0470f4fd8","Type":"ContainerStarted","Data":"38510c676ba94cea1b84a6e55f0dd57c33322293233b34582b91df7a3726e145"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.426054 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jpz9j" podStartSLOduration=176.426038367 podStartE2EDuration="2m56.426038367s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.394690769 +0000 UTC m=+219.010879898" watchObservedRunningTime="2026-03-20 14:54:57.426038367 +0000 UTC m=+219.042227496" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.429814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" event={"ID":"49509900-efe2-4bbb-b89e-5666cec1caf2","Type":"ContainerStarted","Data":"4e14704207cdaddd26895f693fbe2e7fe5ca00fb4dc2a4ffd292701eb0a38a5b"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.431420 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.431717 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:57.931703528 +0000 UTC m=+219.547892657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.432925 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" event={"ID":"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c","Type":"ContainerStarted","Data":"8534a058c4d7dda8518f42bc047e91894d7353aa91b7f7aefcdc96b49187b9f7"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.440267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmwp9" podStartSLOduration=176.44024609 podStartE2EDuration="2m56.44024609s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.421947083 +0000 UTC m=+219.038136212" watchObservedRunningTime="2026-03-20 14:54:57.44024609 +0000 UTC m=+219.056435219" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.460916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" event={"ID":"667df1e6-264f-40c2-a45f-f50c1cf0b88a","Type":"ContainerStarted","Data":"66565e88ac44967a5a53219ca1fa553709e555a914873284a166a5d69a4e246d"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.499588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" 
event={"ID":"c5d60062-e5c3-4d3c-bae9-7c3272c16a17","Type":"ContainerStarted","Data":"16482ab926ed5406b28a92e730810a868a35eaba0fb4e937b4fbf606db6ead9f"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.510432 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" podStartSLOduration=177.510416502 podStartE2EDuration="2m57.510416502s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.461555753 +0000 UTC m=+219.077744882" watchObservedRunningTime="2026-03-20 14:54:57.510416502 +0000 UTC m=+219.126605631" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.532572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.536225 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.036211424 +0000 UTC m=+219.652400553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.540029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cmp9t" podStartSLOduration=176.540011338 podStartE2EDuration="2m56.540011338s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.514802796 +0000 UTC m=+219.130991925" watchObservedRunningTime="2026-03-20 14:54:57.540011338 +0000 UTC m=+219.156200467" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.560147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" event={"ID":"0c6ef043-f571-4aff-90e8-a07752e9086c","Type":"ContainerStarted","Data":"d20945dc018dce2e8543a322d6a301da4fbbc2ce264c14a57815685dbec79d35"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.580102 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-752qt" podStartSLOduration=176.580088625 podStartE2EDuration="2m56.580088625s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.542287869 +0000 UTC m=+219.158476998" watchObservedRunningTime="2026-03-20 14:54:57.580088625 +0000 UTC m=+219.196277754" Mar 20 14:54:57 crc 
kubenswrapper[4764]: I0320 14:54:57.582754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5z97v" event={"ID":"82463101-a3d9-4a1b-a180-aba0318fbeb4","Type":"ContainerStarted","Data":"ac9ed294f69a85d38f7b9c53419749819756c5e9de9a056349b71a50ade1dc37"} Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.588498 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.593848 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.609363 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6c58z" podStartSLOduration=176.609350211 podStartE2EDuration="2m56.609350211s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.608884914 +0000 UTC m=+219.225074063" watchObservedRunningTime="2026-03-20 14:54:57.609350211 +0000 UTC m=+219.225539340" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.609996 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qbtvq" podStartSLOduration=5.609990913 podStartE2EDuration="5.609990913s" podCreationTimestamp="2026-03-20 14:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.581256497 +0000 UTC m=+219.197445626" watchObservedRunningTime="2026-03-20 14:54:57.609990913 +0000 UTC m=+219.226180042" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 
14:54:57.635105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.636190 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.136170429 +0000 UTC m=+219.752359558 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.708317 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-jxhgx" podStartSLOduration=176.708296249 podStartE2EDuration="2m56.708296249s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.657734311 +0000 UTC m=+219.273923440" watchObservedRunningTime="2026-03-20 14:54:57.708296249 +0000 UTC m=+219.324485458" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.740873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.741199 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.241185282 +0000 UTC m=+219.857374421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.822609 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" podStartSLOduration=176.822583481 podStartE2EDuration="2m56.822583481s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.756042638 +0000 UTC m=+219.372231767" watchObservedRunningTime="2026-03-20 14:54:57.822583481 +0000 UTC m=+219.438772610" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.850059 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.850400 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.350370243 +0000 UTC m=+219.966559372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.852848 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" podStartSLOduration=177.852834962 podStartE2EDuration="2m57.852834962s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.85166254 +0000 UTC m=+219.467851669" watchObservedRunningTime="2026-03-20 14:54:57.852834962 +0000 UTC m=+219.469024091" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.853853 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5z97v" podStartSLOduration=176.853847057 podStartE2EDuration="2m56.853847057s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
14:54:57.815978038 +0000 UTC m=+219.432167167" watchObservedRunningTime="2026-03-20 14:54:57.853847057 +0000 UTC m=+219.470036186" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.897633 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" podStartSLOduration=176.897619525 podStartE2EDuration="2m56.897619525s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:57.875003135 +0000 UTC m=+219.491192264" watchObservedRunningTime="2026-03-20 14:54:57.897619525 +0000 UTC m=+219.513808654" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.951298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:57 crc kubenswrapper[4764]: E0320 14:54:57.951637 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.451626305 +0000 UTC m=+220.067815434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.991479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.997340 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:54:57 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:54:57 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:54:57 crc kubenswrapper[4764]: healthz check failed Mar 20 14:54:57 crc kubenswrapper[4764]: I0320 14:54:57.997409 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.053837 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.054178 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.554160321 +0000 UTC m=+220.170349450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.156009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.156326 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.656314624 +0000 UTC m=+220.272503753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.221736 4764 ???:1] "http: TLS handshake error from 192.168.126.11:50964: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.257683 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.757658868 +0000 UTC m=+220.373847997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.259729 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.260057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.260371 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.760356473 +0000 UTC m=+220.376545602 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.327053 4764 ???:1] "http: TLS handshake error from 192.168.126.11:50976: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.361862 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.362144 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.862129893 +0000 UTC m=+220.478319022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.419434 4764 ???:1] "http: TLS handshake error from 192.168.126.11:50992: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.463360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.463721 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:58.963703994 +0000 UTC m=+220.579893123 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.546701 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51004: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.564106 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.564282 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.06425493 +0000 UTC m=+220.680444059 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.564438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.564713 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.064700016 +0000 UTC m=+220.680889135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.625921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" event={"ID":"83c08f82-bea9-451a-b1a5-a98b77e1502e","Type":"ContainerStarted","Data":"44d2925080bd0b23acdf962ddd7f96af11796065341e6f4e530a595559e7eab9"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.627802 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" event={"ID":"83c08f82-bea9-451a-b1a5-a98b77e1502e","Type":"ContainerStarted","Data":"a31271b8278f7b91a0522a7eee401496ae23bd5d5b108cbd56ae5045bab7da02"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.636897 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51014: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.666813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.667125 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 14:54:59.167111118 +0000 UTC m=+220.783300247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.673185 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" podStartSLOduration=178.673171313 podStartE2EDuration="2m58.673171313s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.672921594 +0000 UTC m=+220.289110733" watchObservedRunningTime="2026-03-20 14:54:58.673171313 +0000 UTC m=+220.289360442" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.693175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cs6rb" event={"ID":"f8fb19fb-3696-4328-898f-dcef4558a658","Type":"ContainerStarted","Data":"482b9074bdf6b7fe76e0d77adc5b28c7d318a1e96be6ce69d8580f994b1b6fb5"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.693219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cs6rb" event={"ID":"f8fb19fb-3696-4328-898f-dcef4558a658","Type":"ContainerStarted","Data":"dbf7228fd38aeb99c17341d79130ef164cba903f275b863fbe269ba9317147ca"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.713792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" 
event={"ID":"ac15f1c4-c565-46d8-af08-ba5f7736fe04","Type":"ContainerStarted","Data":"a558ef1619c5d0ad882bc14d08e3adad61f682f764d2f956d1be9a76ecb45d8b"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.713833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" event={"ID":"ac15f1c4-c565-46d8-af08-ba5f7736fe04","Type":"ContainerStarted","Data":"11a39adb9c8f63156378804cc0f437340d3e32b53f9e442e70d1c11879900dd6"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.716230 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.718607 4764 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vrpsq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.718651 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" podUID="ac15f1c4-c565-46d8-af08-ba5f7736fe04" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.736188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" event={"ID":"0c6ef043-f571-4aff-90e8-a07752e9086c","Type":"ContainerStarted","Data":"eec8f7d72d866d2cac3fab53feb6f36486caac329d598c93b7fe380a84630662"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.762569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" 
event={"ID":"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9","Type":"ContainerStarted","Data":"2d8f646d90b1f76b6f80903a27c86477022e1b3b24cd0dcf4161fc5887dc9e5d"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.762819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" event={"ID":"1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9","Type":"ContainerStarted","Data":"0163171e83947647712d693d1005c84be18245e565f872718e027155f2fd8d17"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.763450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.768409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.768523 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-slnm5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.768564 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" podUID="1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.769154 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.269143006 +0000 UTC m=+220.885332135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.794681 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" podStartSLOduration=177.794667289 podStartE2EDuration="2m57.794667289s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.792543754 +0000 UTC m=+220.408732883" watchObservedRunningTime="2026-03-20 14:54:58.794667289 +0000 UTC m=+220.410856418" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.795284 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" podStartSLOduration=177.795279941 podStartE2EDuration="2m57.795279941s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.750452306 +0000 UTC m=+220.366641435" watchObservedRunningTime="2026-03-20 14:54:58.795279941 +0000 UTC m=+220.411469060" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 
14:54:58.797831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" event={"ID":"ea5fe851-776a-4cc8-b365-dea09cc3467a","Type":"ContainerStarted","Data":"18f7f2475c104b148669d849102459ab2968ebc3fd76f87d8ebbb63e98128a96"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.797867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" event={"ID":"ea5fe851-776a-4cc8-b365-dea09cc3467a","Type":"ContainerStarted","Data":"89cd3fba3e6165603631315cccf2390177d1567b4fd34cf742d58f5792221181"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.799533 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51024: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.810771 4764 generic.go:334] "Generic (PLEG): container finished" podID="d06b4728-f677-451f-9b6a-23055e2dde6f" containerID="be1162eb23568853a827c458183deb089fe65349eedc6251e87a859eaadd8d2a" exitCode=0 Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.811322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" event={"ID":"d06b4728-f677-451f-9b6a-23055e2dde6f","Type":"ContainerDied","Data":"be1162eb23568853a827c458183deb089fe65349eedc6251e87a859eaadd8d2a"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.811362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" event={"ID":"d06b4728-f677-451f-9b6a-23055e2dde6f","Type":"ContainerStarted","Data":"64920bf11b1136204235a25346516ad0e7781296603b7b831fbf1751cb223a37"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.811395 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:54:58 
crc kubenswrapper[4764]: I0320 14:54:58.817920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q9j5z" podStartSLOduration=177.817908721 podStartE2EDuration="2m57.817908721s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.816022245 +0000 UTC m=+220.432211374" watchObservedRunningTime="2026-03-20 14:54:58.817908721 +0000 UTC m=+220.434097850" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.837697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" event={"ID":"03046ce0-bd24-4ad7-968e-8b9652b6bccd","Type":"ContainerStarted","Data":"c45fd136c3c0a8369d54930ba0912e0d188cd54a33439e3864615c87370519c8"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.837766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" event={"ID":"03046ce0-bd24-4ad7-968e-8b9652b6bccd","Type":"ContainerStarted","Data":"0221b2d4da9e1b767f37a75c732f0f3db88e6346069f75d97e92524b0dc8e4b3"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.838702 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.854585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" event={"ID":"7000b862-69fa-4708-b74a-2511f65c9569","Type":"ContainerStarted","Data":"eb9322c984c3c08b0fa1ba973df328937f33bc4766fc3475a47aa1e5ff51273f"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.856802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.870314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerStarted","Data":"fe3d59b4dd5fb1167dcb24814c525c24bc646126092a919a5c0be4be171f0bca"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.870374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerStarted","Data":"fbb676f2ba016ddba4c16ca445139044816d2b411509ced4aa5aa17c4da13e01"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.871229 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.871357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.873583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.874012 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.373984854 +0000 UTC m=+220.990173983 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.886054 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" podStartSLOduration=178.886040351 podStartE2EDuration="2m58.886040351s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.885219362 +0000 UTC m=+220.501408491" watchObservedRunningTime="2026-03-20 14:54:58.886040351 +0000 UTC m=+220.502229480" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.886458 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4m6cl" event={"ID":"fe5af817-ef26-4ebf-a14b-bae0470f4fd8","Type":"ContainerStarted","Data":"9bf402b9c7226e1a6e4c22f68fa9f4f7dcc7eb81e1d7a00c0e791d87266f8f3f"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.887024 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hkbns container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.887056 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.887299 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51036: no serving certificate available for the kubelet" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.913697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" event={"ID":"b2d2f76c-cee1-4eba-bf77-a08cc6de3b7b","Type":"ContainerStarted","Data":"59018d7c5764aa48516bcc6d6a5a5531376a561f2e2723bb88acfdd938aa9901"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.927405 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4m2jn" podStartSLOduration=177.927368012 podStartE2EDuration="2m57.927368012s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.924967798 +0000 UTC m=+220.541156927" watchObservedRunningTime="2026-03-20 14:54:58.927368012 +0000 UTC m=+220.543557141" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.939053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" event={"ID":"b1c0c7a8-94b2-434d-9680-31ba9ddcc723","Type":"ContainerStarted","Data":"dac627f5d02a486235d163b6238cf5ee58d05877dcfb222a85eb8d1619064d7f"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.966180 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" event={"ID":"667df1e6-264f-40c2-a45f-f50c1cf0b88a","Type":"ContainerStarted","Data":"29315f7f480e836c979309eb1fb986357c60e12351d519c611379225a090a2f2"} Mar 
20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.975218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:58 crc kubenswrapper[4764]: E0320 14:54:58.976404 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.476369695 +0000 UTC m=+221.092558824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.984050 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" podStartSLOduration=177.984033957 podStartE2EDuration="2m57.984033957s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.978962117 +0000 UTC m=+220.595151236" watchObservedRunningTime="2026-03-20 14:54:58.984033957 +0000 UTC m=+220.600223086" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.986118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" event={"ID":"1cc20998-b9bf-498c-85eb-037843ae0bc6","Type":"ContainerStarted","Data":"deb86af1b39915dd436efe039f0eb91cab69be35c472e7610442272e7a8de5d6"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.995602 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:54:58 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:54:58 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:54:58 crc kubenswrapper[4764]: healthz check failed Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.995645 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.999566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9z78m" event={"ID":"c5d60062-e5c3-4d3c-bae9-7c3272c16a17","Type":"ContainerStarted","Data":"294fd0772baa54aadf8ea024b2163af61956500c2f31b408c79bbb31b2f9aca9"} Mar 20 14:54:58 crc kubenswrapper[4764]: I0320 14:54:58.999936 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" podStartSLOduration=177.999927238 podStartE2EDuration="2m57.999927238s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:58.998546649 +0000 UTC m=+220.614735778" watchObservedRunningTime="2026-03-20 14:54:58.999927238 +0000 UTC 
m=+220.616116367" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.005703 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51042: no serving certificate available for the kubelet" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.010637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" event={"ID":"b2642181-781e-4192-9b05-406b0f97c44a","Type":"ContainerStarted","Data":"7c71569b4dec0a3a4416a1f990d5a3e47503a3783d30aeb6371563917a49989c"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.010692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" event={"ID":"b2642181-781e-4192-9b05-406b0f97c44a","Type":"ContainerStarted","Data":"94d0a8638b24c98f0e0ce668467c4ffc0d6cd191b200695bd1e3f1c2c0e48652"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.012354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" event={"ID":"9519ecde-7ffa-4aba-99a9-fc60b895767b","Type":"ContainerStarted","Data":"8be34cb3385dd09b25520776e1cae378211544fac40f16cdc75cacc3d1e4976c"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.029629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" event={"ID":"1a20779e-1d3a-4c81-86c7-3248b50c8118","Type":"ContainerStarted","Data":"e05b03a6e9dfc5273b2d8b8f7795b765fe8ecc2344eed5840ae227d5faea20ed"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.051550 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wqjvc" podStartSLOduration=178.051523803 podStartE2EDuration="2m58.051523803s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
14:54:59.04890334 +0000 UTC m=+220.665092469" watchObservedRunningTime="2026-03-20 14:54:59.051523803 +0000 UTC m=+220.667712932" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.053747 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" event={"ID":"a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6","Type":"ContainerStarted","Data":"1dd45885945bd7859c9d463f1ef07a6e16ef9cbca8259c68903aa5da1d89893e"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.062470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qbtvq" event={"ID":"483ebf0b-1701-46df-8a4f-281688694851","Type":"ContainerStarted","Data":"ec363ac4b54caaeec1ff620f2ca0cfbc58df5778e183673ae6140e4b50d49d95"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.071974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" event={"ID":"0e7aac7a-18b6-4f47-8f67-05df35c07fd2","Type":"ContainerStarted","Data":"69e09a3fde78c122c54d949454a02175b5e1dc1dcf877668e5c16569f7ec78ae"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.082067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.083732 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.583717292 +0000 UTC m=+221.199906421 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.105195 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-zzvqq" podStartSLOduration=178.10518115 podStartE2EDuration="2m58.10518115s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.102703833 +0000 UTC m=+220.718892962" watchObservedRunningTime="2026-03-20 14:54:59.10518115 +0000 UTC m=+220.721370269" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.117311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" event={"ID":"45d8bf31-ce5e-4f52-9394-1711d8b1f060","Type":"ContainerStarted","Data":"d423670f84ac8eacf1b12eff50247843f737a43c1c06031d2b99415beac1c6ea"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.154449 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tbxfl" podStartSLOduration=178.154433093 podStartE2EDuration="2m58.154433093s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.152186712 +0000 UTC m=+220.768375841" watchObservedRunningTime="2026-03-20 14:54:59.154433093 +0000 
UTC m=+220.770622212" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.181257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9r7l6" event={"ID":"ba8337ed-07c7-4cb3-91bd-899ce9da7a29","Type":"ContainerStarted","Data":"2b605d5d043718e44b72b2e3e88a042ab98549168d98a00c030af4d793cf0609"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.181529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" event={"ID":"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66","Type":"ContainerStarted","Data":"f1dd177ff2520a9e786bea7ef714b5b74322849897b8f56ed049c040430eae7e"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.183231 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.184685 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.684673552 +0000 UTC m=+221.300862681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.212349 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-752qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.212411 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-752qt" podUID="fe82f329-50db-4717-aa9b-6245253449cf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.212557 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-47trd" event={"ID":"0cd6640b-2f6c-4900-b850-bda7f5c9ae6c","Type":"ContainerStarted","Data":"07fb3f07c422e2f4251a52e0f65690ccd422394219c0e643ecce683d26c46886"} Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.245704 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qxxq7" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.284233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.285886 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.785872671 +0000 UTC m=+221.402061800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.380239 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.380554 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.391569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.391855 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.891842499 +0000 UTC m=+221.508031628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.392193 4764 patch_prober.go:28] interesting pod/apiserver-76f77b778f-8gwvk container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.392236 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" podUID="83c08f82-bea9-451a-b1a5-a98b77e1502e" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.409883 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-pjh2l" podStartSLOduration=178.409868576 podStartE2EDuration="2m58.409868576s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.259970235 +0000 UTC m=+220.876159364" watchObservedRunningTime="2026-03-20 14:54:59.409868576 +0000 UTC 
m=+221.026057705" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.491016 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" podStartSLOduration=178.490999145 podStartE2EDuration="2m58.490999145s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.41054859 +0000 UTC m=+221.026737719" watchObservedRunningTime="2026-03-20 14:54:59.490999145 +0000 UTC m=+221.107188274" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.492435 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" podStartSLOduration=178.492430796 podStartE2EDuration="2m58.492430796s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.490392144 +0000 UTC m=+221.106581273" watchObservedRunningTime="2026-03-20 14:54:59.492430796 +0000 UTC m=+221.108619935" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.493369 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.493732 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:54:59.993721481 +0000 UTC m=+221.609910610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.526520 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.526571 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.536566 4764 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-x5nqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.536611 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" podUID="a8fd45b5-c11a-43c7-a1f2-fb4d4f8e5fd6" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.573111 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9r7l6" podStartSLOduration=7.573090838 podStartE2EDuration="7.573090838s" podCreationTimestamp="2026-03-20 14:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 14:54:59.56777424 +0000 UTC m=+221.183963389" watchObservedRunningTime="2026-03-20 14:54:59.573090838 +0000 UTC m=+221.189279967" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.593077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.595277 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.595567 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.095556053 +0000 UTC m=+221.711745182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.691250 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vzfwv" podStartSLOduration=178.691234317 podStartE2EDuration="2m58.691234317s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.687860367 +0000 UTC m=+221.304049496" watchObservedRunningTime="2026-03-20 14:54:59.691234317 +0000 UTC m=+221.307423446" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.701999 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.702301 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.202284807 +0000 UTC m=+221.818473936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.730977 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-sppmr" podStartSLOduration=178.730960702 podStartE2EDuration="2m58.730960702s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.728015087 +0000 UTC m=+221.344204216" watchObservedRunningTime="2026-03-20 14:54:59.730960702 +0000 UTC m=+221.347149831" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.741792 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51046: no serving certificate available for the kubelet" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.789764 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" podStartSLOduration=178.789748221 podStartE2EDuration="2m58.789748221s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.787021244 +0000 UTC m=+221.403210373" watchObservedRunningTime="2026-03-20 14:54:59.789748221 +0000 UTC m=+221.405937350" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.803407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.803683 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.303672523 +0000 UTC m=+221.919861652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.809914 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" podStartSLOduration=179.809897933 podStartE2EDuration="2m59.809897933s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.806739242 +0000 UTC m=+221.422928371" watchObservedRunningTime="2026-03-20 14:54:59.809897933 +0000 UTC m=+221.426087062" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.844948 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fln9z" podStartSLOduration=178.844929932 
podStartE2EDuration="2m58.844929932s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:54:59.838702001 +0000 UTC m=+221.454891130" watchObservedRunningTime="2026-03-20 14:54:59.844929932 +0000 UTC m=+221.461119071" Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.904397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:54:59 crc kubenswrapper[4764]: E0320 14:54:59.904786 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.404757248 +0000 UTC m=+222.020946377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.971773 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"] Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.989685 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:54:59 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:54:59 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:54:59 crc kubenswrapper[4764]: healthz check failed Mar 20 14:54:59 crc kubenswrapper[4764]: I0320 14:54:59.989755 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.006233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.006564 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.506552667 +0000 UTC m=+222.122741796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.107970 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.108121 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.608095859 +0000 UTC m=+222.224284988 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.108232 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.108536 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.608528464 +0000 UTC m=+222.224717593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.114407 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"] Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.208658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.209707 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.709679982 +0000 UTC m=+222.325869111 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.222103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" event={"ID":"49509900-efe2-4bbb-b89e-5666cec1caf2","Type":"ContainerStarted","Data":"cd0bb36fe0ee1faf405e80d1fe6fcb8f25e667e6ae32f08eeecd12d95b015109"} Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.224114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vfjq2" event={"ID":"0e7aac7a-18b6-4f47-8f67-05df35c07fd2","Type":"ContainerStarted","Data":"1fcc8877e806d34f43cba7ab49bbbfe9e11f106ed82d3527c4d1aa9a224689fc"} Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.230289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mvhpw" event={"ID":"1eb8df89-b3e4-4686-b0ed-b4ff1f840c66","Type":"ContainerStarted","Data":"10d1eccaeb5eaca3397b9c46b98d83907c5466961ad205a3a17b7344bb609a25"} Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.261373 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cs6rb" event={"ID":"f8fb19fb-3696-4328-898f-dcef4558a658","Type":"ContainerStarted","Data":"4aa738e45516b93db6a83a8ca6349139f21a41dc96c11192209ddc0f3e37bab2"} Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.262265 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hkbns container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.262312 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.263311 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerName="route-controller-manager" containerID="cri-o://a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f" gracePeriod=30 Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.263356 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-cs6rb" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.289627 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cs6rb" podStartSLOduration=8.289610278 podStartE2EDuration="8.289610278s" podCreationTimestamp="2026-03-20 14:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:00.288892192 +0000 UTC m=+221.905081321" watchObservedRunningTime="2026-03-20 14:55:00.289610278 +0000 UTC m=+221.905799407" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.307762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vrpsq" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.309952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.321165 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.821143743 +0000 UTC m=+222.437332872 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.409797 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c51fd09_c129_48bf_9bf8_2d455b230386.slice/crio-a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.412030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.412201 
4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.912182713 +0000 UTC m=+222.528371842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.412434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.417114 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:00.917098397 +0000 UTC m=+222.533287516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.513587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.513675 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.013660101 +0000 UTC m=+222.629849230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.513922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.514201 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.014194141 +0000 UTC m=+222.630383270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.614833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.614997 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.114972845 +0000 UTC m=+222.731161974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.615467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.615765 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.115752613 +0000 UTC m=+222.731941742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.717989 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.718237 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.218224236 +0000 UTC m=+222.834413355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.752187 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.753571 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.767947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.799221 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.820194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.820255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.820292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8r2\" (UniqueName: \"kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.820327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.820609 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.320597617 +0000 UTC m=+222.936786746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.923143 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.923368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.923422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8r2\" 
(UniqueName: \"kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.923458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.923819 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: E0320 14:55:00.923883 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.423868369 +0000 UTC m=+223.040057498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.924066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.953743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8r2\" (UniqueName: \"kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2\") pod \"community-operators-cmxv2\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.982549 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.983659 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.991006 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:00 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:00 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:00 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.991051 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:00 crc kubenswrapper[4764]: I0320 14:55:00.991493 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.008251 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.018465 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.030021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.030067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.030117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n76fb\" (UniqueName: \"kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.030137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.030440 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.530429038 +0000 UTC m=+223.146618167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.068714 4764 ???:1] "http: TLS handshake error from 192.168.126.11:51050: no serving certificate available for the kubelet" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.117163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert\") pod \"3c51fd09-c129-48bf-9bf8-2d455b230386\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config\") pod \"3c51fd09-c129-48bf-9bf8-2d455b230386\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca\") pod \"3c51fd09-c129-48bf-9bf8-2d455b230386\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135272 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdstj\" (UniqueName: \"kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj\") pod \"3c51fd09-c129-48bf-9bf8-2d455b230386\" (UID: \"3c51fd09-c129-48bf-9bf8-2d455b230386\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n76fb\" (UniqueName: \"kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content\") pod 
\"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.135964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.136760 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.636726137 +0000 UTC m=+223.252915266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.137310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config" (OuterVolumeSpecName: "config") pod "3c51fd09-c129-48bf-9bf8-2d455b230386" (UID: "3c51fd09-c129-48bf-9bf8-2d455b230386"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.137565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c51fd09-c129-48bf-9bf8-2d455b230386" (UID: "3c51fd09-c129-48bf-9bf8-2d455b230386"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.139746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.143548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c51fd09-c129-48bf-9bf8-2d455b230386" (UID: "3c51fd09-c129-48bf-9bf8-2d455b230386"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.144430 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj" (OuterVolumeSpecName: "kube-api-access-kdstj") pod "3c51fd09-c129-48bf-9bf8-2d455b230386" (UID: "3c51fd09-c129-48bf-9bf8-2d455b230386"). InnerVolumeSpecName "kube-api-access-kdstj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.158641 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.158825 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerName="route-controller-manager" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.158838 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerName="route-controller-manager" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.158931 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerName="route-controller-manager" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.159618 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.168419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n76fb\" (UniqueName: \"kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb\") pod \"certified-operators-f7m8j\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.177390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.241662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " 
pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.241754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjz2f\" (UniqueName: \"kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.241832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.242008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.242063 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdstj\" (UniqueName: \"kubernetes.io/projected/3c51fd09-c129-48bf-9bf8-2d455b230386-kube-api-access-kdstj\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.242075 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c51fd09-c129-48bf-9bf8-2d455b230386-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.242084 4764 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.242093 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c51fd09-c129-48bf-9bf8-2d455b230386-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.242340 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.742322891 +0000 UTC m=+223.358512020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.261965 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-slnm5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.262029 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" podUID="1123b16d-7f8c-4cd5-aaf8-aceb528d2fd9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.278976 4764 generic.go:334] "Generic (PLEG): container finished" podID="3c51fd09-c129-48bf-9bf8-2d455b230386" containerID="a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f" exitCode=0 Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.279044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" event={"ID":"3c51fd09-c129-48bf-9bf8-2d455b230386","Type":"ContainerDied","Data":"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f"} Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.279063 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.279071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x" event={"ID":"3c51fd09-c129-48bf-9bf8-2d455b230386","Type":"ContainerDied","Data":"36c0e4b8e348637947c47c961458e90adbf130ba7ed78ab6916acc4d073ede96"} Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.279083 4764 scope.go:117] "RemoveContainer" containerID="a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.295190 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.297069 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-scc4x"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.299442 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerName="controller-manager" containerID="cri-o://5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647" gracePeriod=30 Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.299702 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" event={"ID":"49509900-efe2-4bbb-b89e-5666cec1caf2","Type":"ContainerStarted","Data":"d293de3449ffa1484881c902aafd80d2eabcb4b2edbcc86fb079fa5d2e4a9a79"} Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.301136 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hkbns container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.301222 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.320763 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.342779 4764 scope.go:117] "RemoveContainer" containerID="a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.343321 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.343499 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.343520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjz2f\" (UniqueName: \"kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.343546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.343889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.343949 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.843935695 +0000 UTC m=+223.460124824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.344153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.345197 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f\": container with ID starting with a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f not found: ID does not exist" containerID="a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.345229 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f"} err="failed to get container status \"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f\": rpc error: code = NotFound desc = could not find container \"a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f\": container with ID starting with a30f2d0961a3a49db3b27d26f252bcaa31534aeba17e16d7feb10fdb4382e35f not found: ID does not exist" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.350970 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.351986 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.370232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.378856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjz2f\" (UniqueName: \"kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f\") pod \"community-operators-hvnpx\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.446416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.446498 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzg8\" (UniqueName: \"kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.446663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.446793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.447860 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:01.947846959 +0000 UTC m=+223.564036078 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.487791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.536743 4764 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.545567 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-slnm5" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.547995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.548218 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.048199729 +0000 UTC m=+223.664388858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.548260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.548330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.548362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.548405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzg8\" (UniqueName: \"kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8\") pod \"certified-operators-nphvv\" (UID: 
\"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.548922 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.048914353 +0000 UTC m=+223.665103482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.548955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.549163 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content\") pod \"certified-operators-nphvv\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.576627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzg8\" (UniqueName: \"kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8\") pod \"certified-operators-nphvv\" (UID: 
\"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.650820 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.651178 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.15116363 +0000 UTC m=+223.767352759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.729004 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:55:01 crc kubenswrapper[4764]: W0320 14:55:01.745011 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e76e77_4199_4fdd_b755_10cab62e1370.slice/crio-e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451 WatchSource:0}: Error finding container e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451: Status 404 returned error can't find the container with id 
e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451 Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.753264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.753809 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.253796819 +0000 UTC m=+223.869985948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.807928 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.809699 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.829961 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:55:01 crc kubenswrapper[4764]: W0320 14:55:01.836556 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod548554c3_21d2_4406_a509_e80303628f56.slice/crio-8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08 WatchSource:0}: Error finding container 8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08: Status 404 returned error can't find the container with id 8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08 Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.855521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.857231 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.357211656 +0000 UTC m=+223.973400785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.951603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.960667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca\") pod \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles\") pod \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26nb8\" (UniqueName: \"kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8\") pod \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961232 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert\") pod \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\" (UID: 
\"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config\") pod \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\" (UID: \"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca\") " Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.961688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" (UID: "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: E0320 14:55:01.961777 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.461762444 +0000 UTC m=+224.077951583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.962545 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config" (OuterVolumeSpecName: "config") pod "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" (UID: "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.962563 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" (UID: "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.966979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" (UID: "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.967850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8" (OuterVolumeSpecName: "kube-api-access-26nb8") pod "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" (UID: "fb3e75f0-54ce-4565-aea0-7c4e7abec3ca"). InnerVolumeSpecName "kube-api-access-26nb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.993862 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:01 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:01 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:01 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:01 crc kubenswrapper[4764]: I0320 14:55:01.993905 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:02 crc kubenswrapper[4764]: W0320 14:55:02.045460 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6990bd44_c839_4dcd_bb4a_8d9da96bf644.slice/crio-e9a2c4e839a18275c09bc7a1f7bdbe7d54025e9a91aada75523699deaf717e4d WatchSource:0}: Error finding container e9a2c4e839a18275c09bc7a1f7bdbe7d54025e9a91aada75523699deaf717e4d: Status 404 returned error can't find the container with id e9a2c4e839a18275c09bc7a1f7bdbe7d54025e9a91aada75523699deaf717e4d Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.059893 4764 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.063684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.063851 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.563836973 +0000 UTC m=+224.180026102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.063979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.064085 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-client-ca\") on node 
\"crc\" DevicePath \"\"" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.064100 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.064110 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26nb8\" (UniqueName: \"kubernetes.io/projected/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-kube-api-access-26nb8\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.064120 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.064129 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.064330 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.564323811 +0000 UTC m=+224.180512930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:02 crc kubenswrapper[4764]: W0320 14:55:02.077561 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod429a82b0_5f61_4d42_a0d2_2fcb566f0bcc.slice/crio-c70e8561aa6eb92ac1a8dd26b3d94394e3de043e4cdbc051523d739ecf33788e WatchSource:0}: Error finding container c70e8561aa6eb92ac1a8dd26b3d94394e3de043e4cdbc051523d739ecf33788e: Status 404 returned error can't find the container with id c70e8561aa6eb92ac1a8dd26b3d94394e3de043e4cdbc051523d739ecf33788e Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.164862 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.165194 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.665180358 +0000 UTC m=+224.281369487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.267405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.267746 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 14:55:02.767733695 +0000 UTC m=+224.383922824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8mljc" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.312077 4764 generic.go:334] "Generic (PLEG): container finished" podID="67e76e77-4199-4fdd-b755-10cab62e1370" containerID="75d921f0b333670dd3d0ece1a49bc4b52edebdb3da2335444db898d2f2e8b1d5" exitCode=0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.312320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerDied","Data":"75d921f0b333670dd3d0ece1a49bc4b52edebdb3da2335444db898d2f2e8b1d5"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.312394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerStarted","Data":"e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.317487 4764 generic.go:334] "Generic (PLEG): container finished" podID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerID="7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea" exitCode=0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.317539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerDied","Data":"7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.317564 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerStarted","Data":"c70e8561aa6eb92ac1a8dd26b3d94394e3de043e4cdbc051523d739ecf33788e"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.323618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" event={"ID":"49509900-efe2-4bbb-b89e-5666cec1caf2","Type":"ContainerStarted","Data":"f5ff9d2edc3dcc4e5e54e6565b176de4c91d4efa9e0ffd90a07d4910727e19ff"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.323655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" event={"ID":"49509900-efe2-4bbb-b89e-5666cec1caf2","Type":"ContainerStarted","Data":"50a522a09f4c525db312081decc972b182de2b032bdc9f86d41d6692f19acb9a"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.325358 4764 generic.go:334] "Generic (PLEG): container finished" podID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerID="5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647" exitCode=0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.325419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" event={"ID":"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca","Type":"ContainerDied","Data":"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.325446 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" event={"ID":"fb3e75f0-54ce-4565-aea0-7c4e7abec3ca","Type":"ContainerDied","Data":"6151592aef9500d9865b21c23e99c3880253fd3bd765a4de279dfa47b2816edb"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.325461 4764 scope.go:117] "RemoveContainer" 
containerID="5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.325533 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v94wn" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.335599 4764 generic.go:334] "Generic (PLEG): container finished" podID="548554c3-21d2-4406-a509-e80303628f56" containerID="b85cad5ad5cd32600a6d190dfb2ce71e7bd6053612f19e30699b8d1c9e344ada" exitCode=0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.335660 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerDied","Data":"b85cad5ad5cd32600a6d190dfb2ce71e7bd6053612f19e30699b8d1c9e344ada"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.335687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerStarted","Data":"8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.338605 4764 generic.go:334] "Generic (PLEG): container finished" podID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerID="aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b" exitCode=0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.338647 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerDied","Data":"aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.338663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" 
event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerStarted","Data":"e9a2c4e839a18275c09bc7a1f7bdbe7d54025e9a91aada75523699deaf717e4d"} Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.347771 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x8hb8" podStartSLOduration=10.347756745 podStartE2EDuration="10.347756745s" podCreationTimestamp="2026-03-20 14:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:02.344410216 +0000 UTC m=+223.960599335" watchObservedRunningTime="2026-03-20 14:55:02.347756745 +0000 UTC m=+223.963945874" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.360211 4764 scope.go:117] "RemoveContainer" containerID="5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.361410 4764 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T14:55:01.536767784Z","Handler":null,"Name":""} Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.363172 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647\": container with ID starting with 5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647 not found: ID does not exist" containerID="5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.363213 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647"} err="failed to get container status 
\"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647\": rpc error: code = NotFound desc = could not find container \"5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647\": container with ID starting with 5b083b711c5e97b2b1a142f4b8e0f7de8800893d22aab8ac808238b54773f647 not found: ID does not exist" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.364779 4764 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.364804 4764 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.368138 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.380788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.414642 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.417673 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v94wn"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.471885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.474333 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.474406 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.494939 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8mljc\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.565336 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:02 crc kubenswrapper[4764]: E0320 14:55:02.565560 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerName="controller-manager" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.565572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerName="controller-manager" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.565677 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" containerName="controller-manager" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.566034 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.571530 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.571670 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.571830 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.571907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.572131 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.572270 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.585286 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.673732 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.673785 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcfx\" (UniqueName: \"kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.673827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.673948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.775045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcfx\" (UniqueName: \"kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.775103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config\") pod 
\"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.775147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.775187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.776186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.776630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.794246 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.796816 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.805110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcfx\" (UniqueName: \"kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx\") pod \"route-controller-manager-6dc864f5b4-n2rqw\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.893340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.944799 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.946229 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.949479 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.956510 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.988093 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:02 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:02 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:02 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:02 crc kubenswrapper[4764]: I0320 14:55:02.988147 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.079286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.079326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content\") pod \"redhat-marketplace-44fr6\" (UID: 
\"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.079484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj459\" (UniqueName: \"kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.116147 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.133599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c51fd09-c129-48bf-9bf8-2d455b230386" path="/var/lib/kubelet/pods/3c51fd09-c129-48bf-9bf8-2d455b230386/volumes" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.134992 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.135552 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3e75f0-54ce-4565-aea0-7c4e7abec3ca" path="/var/lib/kubelet/pods/fb3e75f0-54ce-4565-aea0-7c4e7abec3ca/volumes" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.180985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj459\" (UniqueName: \"kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.181046 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.181072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.181579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.182130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.209195 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj459\" (UniqueName: \"kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459\") pod \"redhat-marketplace-44fr6\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.233645 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"] Mar 20 14:55:03 crc kubenswrapper[4764]: W0320 14:55:03.238889 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd07d531_e6b9_4b58_9e6c_8012b3a473eb.slice/crio-1a7975ca39237d5d9eb2d3836d4599297e7739292f688daa7da83ca9b896c8df WatchSource:0}: Error finding container 1a7975ca39237d5d9eb2d3836d4599297e7739292f688daa7da83ca9b896c8df: Status 404 returned error can't find the container with id 1a7975ca39237d5d9eb2d3836d4599297e7739292f688daa7da83ca9b896c8df Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.316292 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.341255 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.342302 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.346780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" event={"ID":"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4","Type":"ContainerStarted","Data":"9dee3f4deabe1fac787267cd725a2a59ac712a81f2b1fcb3c241ce52684b79e7"} Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.346815 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" event={"ID":"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4","Type":"ContainerStarted","Data":"6203773ddd96936bb832d1aef7e2bfc7cf5f286cda9c8374efe31e0750fd4e72"} Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.347621 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.359346 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.362162 4764 generic.go:334] "Generic (PLEG): container finished" podID="1a20779e-1d3a-4c81-86c7-3248b50c8118" containerID="e05b03a6e9dfc5273b2d8b8f7795b765fe8ecc2344eed5840ae227d5faea20ed" exitCode=0 Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.362240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" event={"ID":"1a20779e-1d3a-4c81-86c7-3248b50c8118","Type":"ContainerDied","Data":"e05b03a6e9dfc5273b2d8b8f7795b765fe8ecc2344eed5840ae227d5faea20ed"} Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.385491 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" 
podStartSLOduration=3.385476274 podStartE2EDuration="3.385476274s" podCreationTimestamp="2026-03-20 14:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:03.380903462 +0000 UTC m=+224.997092581" watchObservedRunningTime="2026-03-20 14:55:03.385476274 +0000 UTC m=+225.001665403" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.387867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" event={"ID":"fd07d531-e6b9-4b58-9e6c-8012b3a473eb","Type":"ContainerStarted","Data":"1a7975ca39237d5d9eb2d3836d4599297e7739292f688daa7da83ca9b896c8df"} Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.484263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.484344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbp8\" (UniqueName: \"kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.484403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc 
kubenswrapper[4764]: I0320 14:55:03.566259 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.582489 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.584819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586372 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbp8\" (UniqueName: \"kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586490 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content\") pod 
\"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.586881 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.587020 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.587041 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.587102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.587343 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.587506 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.592640 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.601292 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" 
Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.624975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbp8\" (UniqueName: \"kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8\") pod \"redhat-marketplace-tmdd7\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.646992 4764 ???:1] "http: TLS handshake error from 192.168.126.11:47338: no serving certificate available for the kubelet" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.656168 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.687751 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.687829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.687871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: 
\"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.687906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5q8s\" (UniqueName: \"kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.688039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.737390 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.789029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.789901 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " 
pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.789947 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.790053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5q8s\" (UniqueName: \"kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.790108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.790672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.791301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config\") pod 
\"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.792267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.798314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.812065 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5q8s\" (UniqueName: \"kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s\") pod \"controller-manager-7c4b569c4c-qj4w2\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.921369 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.943190 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.944515 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.947788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.961110 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.979356 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hf2cz" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.995256 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.995314 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xz7\" (UniqueName: \"kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.995492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.999468 4764 patch_prober.go:28] interesting 
pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:03 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:03 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:03 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:03 crc kubenswrapper[4764]: I0320 14:55:03.999511 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.001174 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.002526 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.002986 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.006906 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.006956 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.097263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.099561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.100995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.101055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xz7\" 
(UniqueName: \"kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.102850 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.102929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.103229 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.129277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xz7\" (UniqueName: \"kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7\") pod \"redhat-operators-fzfpf\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.162898 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 
14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.200093 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:04 crc kubenswrapper[4764]: W0320 14:55:04.200745 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8776b7a_9a4d_41a5_a022_701f97953a5f.slice/crio-c69e6c4bd6f02410db897a0f6478f21fbfa40a6b937491d96f2e17c15df6d02b WatchSource:0}: Error finding container c69e6c4bd6f02410db897a0f6478f21fbfa40a6b937491d96f2e17c15df6d02b: Status 404 returned error can't find the container with id c69e6c4bd6f02410db897a0f6478f21fbfa40a6b937491d96f2e17c15df6d02b Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.204718 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.204804 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.204884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: W0320 14:55:04.208979 4764 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod501d0198_7b72_4711_9422_ea6522ee55ff.slice/crio-2d76e3aa241c8aa96cde7ed8977205d2b2399aa1e46899366d4068c568a710ea WatchSource:0}: Error finding container 2d76e3aa241c8aa96cde7ed8977205d2b2399aa1e46899366d4068c568a710ea: Status 404 returned error can't find the container with id 2d76e3aa241c8aa96cde7ed8977205d2b2399aa1e46899366d4068c568a710ea Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.226159 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.330298 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.340829 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.344172 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.345095 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.370160 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.383486 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.388240 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-8gwvk" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.393858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" event={"ID":"fd07d531-e6b9-4b58-9e6c-8012b3a473eb","Type":"ContainerStarted","Data":"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885"} Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.394417 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.395805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerStarted","Data":"c69e6c4bd6f02410db897a0f6478f21fbfa40a6b937491d96f2e17c15df6d02b"} Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.397072 4764 generic.go:334] "Generic (PLEG): container finished" podID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerID="c0faee55593f47a0854b211b39c799a6af96eef07aaea0e40978334e0faf40fd" exitCode=0 Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.397116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" 
event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerDied","Data":"c0faee55593f47a0854b211b39c799a6af96eef07aaea0e40978334e0faf40fd"} Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.397131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerStarted","Data":"07dfdebd1cc7c6971276329cbe403c743b9960a69ae94759aa4101b53c960564"} Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.406843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72pk\" (UniqueName: \"kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.406932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.406961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.406983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" 
event={"ID":"501d0198-7b72-4711-9422-ea6522ee55ff","Type":"ContainerStarted","Data":"2d76e3aa241c8aa96cde7ed8977205d2b2399aa1e46899366d4068c568a710ea"} Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.452770 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" podStartSLOduration=183.452754028 podStartE2EDuration="3m3.452754028s" podCreationTimestamp="2026-03-20 14:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:04.450639424 +0000 UTC m=+226.066828553" watchObservedRunningTime="2026-03-20 14:55:04.452754028 +0000 UTC m=+226.068943157" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.518140 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.518449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.518521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72pk\" (UniqueName: \"kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.522095 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.523685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.555617 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.580986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72pk\" (UniqueName: \"kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk\") pod \"redhat-operators-4llk9\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.589348 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.593634 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.596482 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.597619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.597720 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.601565 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x5nqv" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.613613 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-752qt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.613671 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-752qt" podUID="fe82f329-50db-4717-aa9b-6245253449cf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.613963 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-752qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.613990 4764 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-752qt" podUID="fe82f329-50db-4717-aa9b-6245253449cf" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.619807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.619983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.648037 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.650859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.662272 4764 patch_prober.go:28] interesting pod/console-f9d7485db-5z97v container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.662319 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5z97v" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.680169 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.728184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.728216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.730894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.755911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.776330 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.934091 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.986190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.989030 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:04 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:04 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:04 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:04 crc kubenswrapper[4764]: I0320 14:55:04.989082 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.438142 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" event={"ID":"501d0198-7b72-4711-9422-ea6522ee55ff","Type":"ContainerStarted","Data":"28e553a1819b964f7b1ca70ae8af2dff56027018e038aec46159ef82c6f2f2b5"} Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.438360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.442610 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.445498 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerID="095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87" exitCode=0 Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.445623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerDied","Data":"095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87"} Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.453948 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" podStartSLOduration=5.453932856 podStartE2EDuration="5.453932856s" podCreationTimestamp="2026-03-20 14:55:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:05.450327108 +0000 UTC m=+227.066516237" watchObservedRunningTime="2026-03-20 14:55:05.453932856 +0000 UTC m=+227.070121985" Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.658580 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:55:05 crc kubenswrapper[4764]: I0320 14:55:05.988049 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:05 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:05 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:05 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:05 crc 
kubenswrapper[4764]: I0320 14:55:05.988098 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:06 crc kubenswrapper[4764]: I0320 14:55:06.988093 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:06 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:06 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:06 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:06 crc kubenswrapper[4764]: I0320 14:55:06.988493 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:07 crc kubenswrapper[4764]: I0320 14:55:07.807024 4764 ???:1] "http: TLS handshake error from 192.168.126.11:47352: no serving certificate available for the kubelet" Mar 20 14:55:07 crc kubenswrapper[4764]: I0320 14:55:07.986827 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:07 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:07 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:07 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:07 crc kubenswrapper[4764]: I0320 14:55:07.986986 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" 
podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.443326 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.443373 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.650523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.790166 4764 ???:1] "http: TLS handshake error from 192.168.126.11:47354: no serving certificate available for the kubelet" Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.991340 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:08 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:08 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:08 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:08 crc kubenswrapper[4764]: I0320 14:55:08.991699 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:09 crc kubenswrapper[4764]: I0320 14:55:09.987205 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:09 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:09 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:09 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:09 crc kubenswrapper[4764]: I0320 14:55:09.987252 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:10 crc kubenswrapper[4764]: I0320 14:55:10.565455 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cs6rb" Mar 20 14:55:10 crc kubenswrapper[4764]: I0320 14:55:10.988047 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:10 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:10 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:10 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:10 crc kubenswrapper[4764]: I0320 14:55:10.988099 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:11 crc kubenswrapper[4764]: W0320 14:55:11.253998 4764 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a051746_92b7_4a16_a641_d73888dcfcca.slice/crio-c3b7e8fc8712ea940f2df816c005b5ec66ad3a157dc223b0d1f0c821d384a5a5 WatchSource:0}: Error finding container c3b7e8fc8712ea940f2df816c005b5ec66ad3a157dc223b0d1f0c821d384a5a5: Status 404 returned error can't find the container with id c3b7e8fc8712ea940f2df816c005b5ec66ad3a157dc223b0d1f0c821d384a5a5 Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.284583 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.370411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgmh\" (UniqueName: \"kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh\") pod \"1a20779e-1d3a-4c81-86c7-3248b50c8118\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.370476 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume\") pod \"1a20779e-1d3a-4c81-86c7-3248b50c8118\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.370577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume\") pod \"1a20779e-1d3a-4c81-86c7-3248b50c8118\" (UID: \"1a20779e-1d3a-4c81-86c7-3248b50c8118\") " Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.371263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume" (OuterVolumeSpecName: 
"config-volume") pod "1a20779e-1d3a-4c81-86c7-3248b50c8118" (UID: "1a20779e-1d3a-4c81-86c7-3248b50c8118"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.379003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a20779e-1d3a-4c81-86c7-3248b50c8118" (UID: "1a20779e-1d3a-4c81-86c7-3248b50c8118"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.380857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh" (OuterVolumeSpecName: "kube-api-access-zhgmh") pod "1a20779e-1d3a-4c81-86c7-3248b50c8118" (UID: "1a20779e-1d3a-4c81-86c7-3248b50c8118"). InnerVolumeSpecName "kube-api-access-zhgmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.475246 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.480082 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhgmh\" (UniqueName: \"kubernetes.io/projected/1a20779e-1d3a-4c81-86c7-3248b50c8118-kube-api-access-zhgmh\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.480466 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a20779e-1d3a-4c81-86c7-3248b50c8118-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.480477 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a20779e-1d3a-4c81-86c7-3248b50c8118-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.486997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerStarted","Data":"c3b7e8fc8712ea940f2df816c005b5ec66ad3a157dc223b0d1f0c821d384a5a5"} Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.488725 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" event={"ID":"1a20779e-1d3a-4c81-86c7-3248b50c8118","Type":"ContainerDied","Data":"655641a7fb6341df89e5739b1fde5b70996487251a706f9b8c34449a7c833833"} Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.488771 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.488856 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655641a7fb6341df89e5739b1fde5b70996487251a706f9b8c34449a7c833833" Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.989558 4764 patch_prober.go:28] interesting pod/router-default-5444994796-jpz9j container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 14:55:11 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Mar 20 14:55:11 crc kubenswrapper[4764]: [+]process-running ok Mar 20 14:55:11 crc kubenswrapper[4764]: healthz check failed Mar 20 14:55:11 crc kubenswrapper[4764]: I0320 14:55:11.989616 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jpz9j" podUID="fb145565-51bb-4217-b1c0-fec824da2124" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:55:12 crc kubenswrapper[4764]: I0320 14:55:12.988686 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:55:12 crc kubenswrapper[4764]: I0320 14:55:12.998769 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jpz9j" Mar 20 14:55:14 crc kubenswrapper[4764]: I0320 14:55:14.619671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-752qt" Mar 20 14:55:14 crc kubenswrapper[4764]: I0320 14:55:14.653813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:55:14 crc kubenswrapper[4764]: I0320 14:55:14.666330 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5z97v" Mar 20 14:55:15 crc kubenswrapper[4764]: W0320 14:55:15.645292 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe980bd7_bc5c_4308_ba3c_a264ab70b88d.slice/crio-3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04 WatchSource:0}: Error finding container 3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04: Status 404 returned error can't find the container with id 3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04 Mar 20 14:55:16 crc kubenswrapper[4764]: I0320 14:55:16.487812 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 14:55:16 crc kubenswrapper[4764]: I0320 14:55:16.521178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe980bd7-bc5c-4308-ba3c-a264ab70b88d","Type":"ContainerStarted","Data":"3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04"} Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.175411 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.176366 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" podUID="501d0198-7b72-4711-9422-ea6522ee55ff" containerName="controller-manager" containerID="cri-o://28e553a1819b964f7b1ca70ae8af2dff56027018e038aec46159ef82c6f2f2b5" gracePeriod=30 Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.186443 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.186721 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerName="route-controller-manager" containerID="cri-o://9dee3f4deabe1fac787267cd725a2a59ac712a81f2b1fcb3c241ce52684b79e7" gracePeriod=30 Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.539922 4764 generic.go:334] "Generic (PLEG): container finished" podID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerID="9dee3f4deabe1fac787267cd725a2a59ac712a81f2b1fcb3c241ce52684b79e7" exitCode=0 Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.539988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" event={"ID":"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4","Type":"ContainerDied","Data":"9dee3f4deabe1fac787267cd725a2a59ac712a81f2b1fcb3c241ce52684b79e7"} Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.541829 4764 generic.go:334] "Generic (PLEG): container finished" podID="501d0198-7b72-4711-9422-ea6522ee55ff" containerID="28e553a1819b964f7b1ca70ae8af2dff56027018e038aec46159ef82c6f2f2b5" exitCode=0 Mar 20 14:55:19 crc kubenswrapper[4764]: I0320 14:55:19.541872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" event={"ID":"501d0198-7b72-4711-9422-ea6522ee55ff","Type":"ContainerDied","Data":"28e553a1819b964f7b1ca70ae8af2dff56027018e038aec46159ef82c6f2f2b5"} Mar 20 14:55:22 crc kubenswrapper[4764]: I0320 14:55:22.458466 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:55:22 crc kubenswrapper[4764]: I0320 14:55:22.802798 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:55:22 crc kubenswrapper[4764]: E0320 14:55:22.919560 4764 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 14:55:22 crc kubenswrapper[4764]: E0320 14:55:22.919951 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 14:55:22 crc kubenswrapper[4764]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 14:55:22 crc kubenswrapper[4764]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-486kc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566974-5687l_openshift-infra(1f3afda6-923c-403a-994d-996da0ad0fee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 14:55:22 crc kubenswrapper[4764]: > logger="UnhandledError" Mar 20 14:55:22 crc kubenswrapper[4764]: E0320 14:55:22.921107 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-infra/auto-csr-approver-29566974-5687l" podUID="1f3afda6-923c-403a-994d-996da0ad0fee" Mar 20 14:55:23 crc kubenswrapper[4764]: E0320 14:55:23.577168 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566974-5687l" podUID="1f3afda6-923c-403a-994d-996da0ad0fee" Mar 20 14:55:23 crc kubenswrapper[4764]: I0320 14:55:23.895028 4764 patch_prober.go:28] interesting pod/route-controller-manager-6dc864f5b4-n2rqw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:55:23 crc kubenswrapper[4764]: I0320 14:55:23.895113 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:55:24 crc kubenswrapper[4764]: I0320 14:55:24.923022 4764 patch_prober.go:28] interesting pod/controller-manager-7c4b569c4c-qj4w2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:55:24 crc kubenswrapper[4764]: I0320 14:55:24.923192 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" 
podUID="501d0198-7b72-4711-9422-ea6522ee55ff" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:55:25 crc kubenswrapper[4764]: E0320 14:55:25.303311 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 14:55:25 crc kubenswrapper[4764]: E0320 14:55:25.303630 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n76fb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:n
il,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f7m8j_openshift-marketplace(548554c3-21d2-4406-a509-e80303628f56): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 14:55:25 crc kubenswrapper[4764]: E0320 14:55:25.305020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f7m8j" podUID="548554c3-21d2-4406-a509-e80303628f56" Mar 20 14:55:25 crc kubenswrapper[4764]: I0320 14:55:25.457889 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 14:55:29 crc kubenswrapper[4764]: I0320 14:55:29.301675 4764 ???:1] "http: TLS handshake error from 192.168.126.11:49120: no serving certificate available for the kubelet" Mar 20 14:55:30 crc kubenswrapper[4764]: E0320 14:55:30.933343 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f7m8j" podUID="548554c3-21d2-4406-a509-e80303628f56" Mar 20 14:55:30 crc kubenswrapper[4764]: W0320 14:55:30.934289 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ed6b8_e9f9_4d40_8700_77c6b3919a4b.slice/crio-e0b342f4464ee6f811f4f7ded429693f3eba70521e85b27e91beca253282af2b WatchSource:0}: Error finding container 
e0b342f4464ee6f811f4f7ded429693f3eba70521e85b27e91beca253282af2b: Status 404 returned error can't find the container with id e0b342f4464ee6f811f4f7ded429693f3eba70521e85b27e91beca253282af2b Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.021302 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.027708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064419 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:31 crc kubenswrapper[4764]: E0320 14:55:31.064689 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501d0198-7b72-4711-9422-ea6522ee55ff" containerName="controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="501d0198-7b72-4711-9422-ea6522ee55ff" containerName="controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: E0320 14:55:31.064720 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerName="route-controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064728 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerName="route-controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: E0320 14:55:31.064749 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a20779e-1d3a-4c81-86c7-3248b50c8118" containerName="collect-profiles" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064758 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a20779e-1d3a-4c81-86c7-3248b50c8118" containerName="collect-profiles" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064880 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="501d0198-7b72-4711-9422-ea6522ee55ff" containerName="controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064897 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" containerName="route-controller-manager" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.064913 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a20779e-1d3a-4c81-86c7-3248b50c8118" containerName="collect-profiles" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.065357 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.071930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.107814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca\") pod \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.107889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config\") pod \"501d0198-7b72-4711-9422-ea6522ee55ff\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.107956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles\") pod \"501d0198-7b72-4711-9422-ea6522ee55ff\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.107986 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfcfx\" (UniqueName: \"kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx\") pod \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca\") pod \"501d0198-7b72-4711-9422-ea6522ee55ff\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert\") pod \"501d0198-7b72-4711-9422-ea6522ee55ff\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108122 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert\") pod \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108184 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5q8s\" (UniqueName: \"kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s\") pod \"501d0198-7b72-4711-9422-ea6522ee55ff\" (UID: \"501d0198-7b72-4711-9422-ea6522ee55ff\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 
14:55:31.108219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config\") pod \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\" (UID: \"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4\") " Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9bw\" (UniqueName: \"kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.108696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: 
\"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.109800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca" (OuterVolumeSpecName: "client-ca") pod "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" (UID: "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.110762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config" (OuterVolumeSpecName: "config") pod "501d0198-7b72-4711-9422-ea6522ee55ff" (UID: "501d0198-7b72-4711-9422-ea6522ee55ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.111633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "501d0198-7b72-4711-9422-ea6522ee55ff" (UID: "501d0198-7b72-4711-9422-ea6522ee55ff"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.114579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config" (OuterVolumeSpecName: "config") pod "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" (UID: "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.115467 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "501d0198-7b72-4711-9422-ea6522ee55ff" (UID: "501d0198-7b72-4711-9422-ea6522ee55ff"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.118829 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s" (OuterVolumeSpecName: "kube-api-access-s5q8s") pod "501d0198-7b72-4711-9422-ea6522ee55ff" (UID: "501d0198-7b72-4711-9422-ea6522ee55ff"). InnerVolumeSpecName "kube-api-access-s5q8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.119419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "501d0198-7b72-4711-9422-ea6522ee55ff" (UID: "501d0198-7b72-4711-9422-ea6522ee55ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.120120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" (UID: "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.127546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx" (OuterVolumeSpecName: "kube-api-access-pfcfx") pod "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" (UID: "9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4"). InnerVolumeSpecName "kube-api-access-pfcfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209800 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm9bw\" (UniqueName: \"kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209953 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209968 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209980 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.209994 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfcfx\" (UniqueName: \"kubernetes.io/projected/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-kube-api-access-pfcfx\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.210005 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/501d0198-7b72-4711-9422-ea6522ee55ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.210016 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/501d0198-7b72-4711-9422-ea6522ee55ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.210029 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.210041 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5q8s\" (UniqueName: \"kubernetes.io/projected/501d0198-7b72-4711-9422-ea6522ee55ff-kube-api-access-s5q8s\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.210053 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.211502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.212306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.215605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.224975 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9bw\" (UniqueName: \"kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw\") pod \"route-controller-manager-7698bbdc48-bmtzv\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.390000 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.628522 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.628756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw" event={"ID":"9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4","Type":"ContainerDied","Data":"6203773ddd96936bb832d1aef7e2bfc7cf5f286cda9c8374efe31e0750fd4e72"} Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.628934 4764 scope.go:117] "RemoveContainer" containerID="9dee3f4deabe1fac787267cd725a2a59ac712a81f2b1fcb3c241ce52684b79e7" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.633170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerStarted","Data":"e0b342f4464ee6f811f4f7ded429693f3eba70521e85b27e91beca253282af2b"} Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.638723 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" 
event={"ID":"501d0198-7b72-4711-9422-ea6522ee55ff","Type":"ContainerDied","Data":"2d76e3aa241c8aa96cde7ed8977205d2b2399aa1e46899366d4068c568a710ea"} Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.638977 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2" Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.665577 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.674275 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc864f5b4-n2rqw"] Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.681490 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:31 crc kubenswrapper[4764]: I0320 14:55:31.685839 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c4b569c4c-qj4w2"] Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.005029 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.005241 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjz2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hvnpx_openshift-marketplace(6990bd44-c839-4dcd-bb4a-8d9da96bf644): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.007176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hvnpx" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" Mar 20 14:55:32 crc 
kubenswrapper[4764]: E0320 14:55:32.165141 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.165283 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmzg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-nphvv_openshift-marketplace(429a82b0-5f61-4d42-a0d2-2fcb566f0bcc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.166678 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nphvv" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.576445 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.576643 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pb8r2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-cmxv2_openshift-marketplace(67e76e77-4199-4fdd-b755-10cab62e1370): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 14:55:32 crc kubenswrapper[4764]: E0320 14:55:32.577816 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cmxv2" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" Mar 20 14:55:33 crc 
kubenswrapper[4764]: I0320 14:55:33.134982 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501d0198-7b72-4711-9422-ea6522ee55ff" path="/var/lib/kubelet/pods/501d0198-7b72-4711-9422-ea6522ee55ff/volumes" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.135652 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4" path="/var/lib/kubelet/pods/9551a0a7-c08e-47dc-ae7a-f1c9c8b1f9f4/volumes" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.286795 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hvnpx" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.286969 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nphvv" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.288405 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cmxv2" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.320918 4764 scope.go:117] "RemoveContainer" containerID="28e553a1819b964f7b1ca70ae8af2dff56027018e038aec46159ef82c6f2f2b5" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.321812 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.322809 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dj459,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-44fr6_openshift-marketplace(830768c7-49e2-4ed5-af8e-3762dc00534e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" logger="UnhandledError" Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.324734 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-44fr6" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.582028 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.583633 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.586421 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.588221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.588529 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.588756 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.589063 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.589212 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:55:33 crc 
kubenswrapper[4764]: I0320 14:55:33.595706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.596801 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.638164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.638508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gq8w\" (UniqueName: \"kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.638580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.638611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: 
\"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.638740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.648813 4764 generic.go:334] "Generic (PLEG): container finished" podID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerID="2bdc0a2a551c8d22b98cfe9b4eb4bb1bfd209dd40d6a6860c7e233b9449a4d02" exitCode=0 Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.648858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerDied","Data":"2bdc0a2a551c8d22b98cfe9b4eb4bb1bfd209dd40d6a6860c7e233b9449a4d02"} Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.649680 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"680186dc-8ec9-4306-aeda-d0bea7fa9a08","Type":"ContainerStarted","Data":"55181352f2bc152630b552a4537ee19433026c76eae638e630f6d78e26014392"} Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.655556 4764 generic.go:334] "Generic (PLEG): container finished" podID="4a051746-92b7-4a16-a641-d73888dcfcca" containerID="f70cad435924bb94f14bff90c4487561a5322a941d008a84edc5d8b1cc212fb4" exitCode=0 Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.655596 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" 
event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerDied","Data":"f70cad435924bb94f14bff90c4487561a5322a941d008a84edc5d8b1cc212fb4"} Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.658036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe980bd7-bc5c-4308-ba3c-a264ab70b88d","Type":"ContainerStarted","Data":"3e6e4abb191be63bdeed9ab0e7d6f3ba51a0970f2d6ac14c40f64719e9dd805c"} Mar 20 14:55:33 crc kubenswrapper[4764]: E0320 14:55:33.679799 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-44fr6" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.681109 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=30.681092972 podStartE2EDuration="30.681092972s" podCreationTimestamp="2026-03-20 14:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:33.677777045 +0000 UTC m=+255.293966194" watchObservedRunningTime="2026-03-20 14:55:33.681092972 +0000 UTC m=+255.297282101" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.739955 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.740021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4gq8w\" (UniqueName: \"kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.740094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.740120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.740179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.741678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: 
I0320 14:55:33.741836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.742297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.745698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.759075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gq8w\" (UniqueName: \"kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w\") pod \"controller-manager-6bdb574b79-7vfgz\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:33 crc kubenswrapper[4764]: I0320 14:55:33.873780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.007486 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.188754 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.665337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" event={"ID":"4b584f21-f79c-4bd6-937e-becba3cc91e6","Type":"ContainerStarted","Data":"976f98778422af3918d1d9a7060837e2e77955515bc40e062dc04cb0b24b3d94"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.665610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" event={"ID":"4b584f21-f79c-4bd6-937e-becba3cc91e6","Type":"ContainerStarted","Data":"457fd46deefd1e119c6a0b5fd3a5218f8f8525bdf5a51852fc7d427218dd8d4a"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.665930 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.671403 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.676495 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe980bd7-bc5c-4308-ba3c-a264ab70b88d" containerID="3e6e4abb191be63bdeed9ab0e7d6f3ba51a0970f2d6ac14c40f64719e9dd805c" exitCode=0 Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.676679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe980bd7-bc5c-4308-ba3c-a264ab70b88d","Type":"ContainerDied","Data":"3e6e4abb191be63bdeed9ab0e7d6f3ba51a0970f2d6ac14c40f64719e9dd805c"} Mar 20 14:55:34 crc 
kubenswrapper[4764]: I0320 14:55:34.683039 4764 generic.go:334] "Generic (PLEG): container finished" podID="680186dc-8ec9-4306-aeda-d0bea7fa9a08" containerID="5cafb7ab793db0ce88f3aa890d63c92ddb74bd9f14690c1156301c0df374b6b7" exitCode=0 Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.683110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"680186dc-8ec9-4306-aeda-d0bea7fa9a08","Type":"ContainerDied","Data":"5cafb7ab793db0ce88f3aa890d63c92ddb74bd9f14690c1156301c0df374b6b7"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.686315 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerID="e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6" exitCode=0 Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.686390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerDied","Data":"e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.688464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" event={"ID":"0099134f-b810-4235-8d81-a0dfba45f3b9","Type":"ContainerStarted","Data":"e55e344eacddcb4407d46a6d4f9ff8b3b0d7a24b57932227e6bf0a3ce85d5b13"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.688490 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" event={"ID":"0099134f-b810-4235-8d81-a0dfba45f3b9","Type":"ContainerStarted","Data":"0046369694da55adc0651d58c598129e6ee5eb2fa2c021d10de0df248487c697"} Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.688960 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.693038 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" podStartSLOduration=15.693026759 podStartE2EDuration="15.693026759s" podCreationTimestamp="2026-03-20 14:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:34.681322946 +0000 UTC m=+256.297512085" watchObservedRunningTime="2026-03-20 14:55:34.693026759 +0000 UTC m=+256.309215888" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.697089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:34 crc kubenswrapper[4764]: I0320 14:55:34.769869 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" podStartSLOduration=15.769847426 podStartE2EDuration="15.769847426s" podCreationTimestamp="2026-03-20 14:55:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:34.748282294 +0000 UTC m=+256.364471423" watchObservedRunningTime="2026-03-20 14:55:34.769847426 +0000 UTC m=+256.386036555" Mar 20 14:55:35 crc kubenswrapper[4764]: I0320 14:55:35.506158 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cnd46" Mar 20 14:55:35 crc kubenswrapper[4764]: I0320 14:55:35.696272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-5687l" 
event={"ID":"1f3afda6-923c-403a-994d-996da0ad0fee","Type":"ContainerStarted","Data":"1b9deeba3ca0056c2b95e510afaddba5c76a8abf7b5acaf54d959db06f192793"} Mar 20 14:55:35 crc kubenswrapper[4764]: I0320 14:55:35.701066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerStarted","Data":"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce"} Mar 20 14:55:35 crc kubenswrapper[4764]: I0320 14:55:35.715725 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566974-5687l" podStartSLOduration=56.690794385 podStartE2EDuration="1m35.715705526s" podCreationTimestamp="2026-03-20 14:54:00 +0000 UTC" firstStartedPulling="2026-03-20 14:54:56.251338234 +0000 UTC m=+217.867527363" lastFinishedPulling="2026-03-20 14:55:35.276249385 +0000 UTC m=+256.892438504" observedRunningTime="2026-03-20 14:55:35.712797223 +0000 UTC m=+257.328986352" watchObservedRunningTime="2026-03-20 14:55:35.715705526 +0000 UTC m=+257.331894655" Mar 20 14:55:35 crc kubenswrapper[4764]: I0320 14:55:35.737162 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tmdd7" podStartSLOduration=8.832754383 podStartE2EDuration="32.737130904s" podCreationTimestamp="2026-03-20 14:55:03 +0000 UTC" firstStartedPulling="2026-03-20 14:55:11.246634945 +0000 UTC m=+232.862824074" lastFinishedPulling="2026-03-20 14:55:35.151011466 +0000 UTC m=+256.767200595" observedRunningTime="2026-03-20 14:55:35.733857499 +0000 UTC m=+257.350046628" watchObservedRunningTime="2026-03-20 14:55:35.737130904 +0000 UTC m=+257.353320033" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.006588 4764 csr.go:261] certificate signing request csr-cnn2q is approved, waiting to be issued Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.014793 4764 csr.go:257] certificate signing 
request csr-cnn2q is issued Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.022183 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.071132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access\") pod \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.071218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir\") pod \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\" (UID: \"680186dc-8ec9-4306-aeda-d0bea7fa9a08\") " Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.071817 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "680186dc-8ec9-4306-aeda-d0bea7fa9a08" (UID: "680186dc-8ec9-4306-aeda-d0bea7fa9a08"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.084789 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "680186dc-8ec9-4306-aeda-d0bea7fa9a08" (UID: "680186dc-8ec9-4306-aeda-d0bea7fa9a08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.156983 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.172440 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access\") pod \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.172493 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir\") pod \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\" (UID: \"fe980bd7-bc5c-4308-ba3c-a264ab70b88d\") " Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.172708 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.172723 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/680186dc-8ec9-4306-aeda-d0bea7fa9a08-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.172765 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fe980bd7-bc5c-4308-ba3c-a264ab70b88d" (UID: "fe980bd7-bc5c-4308-ba3c-a264ab70b88d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.175805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fe980bd7-bc5c-4308-ba3c-a264ab70b88d" (UID: "fe980bd7-bc5c-4308-ba3c-a264ab70b88d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.273775 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.273809 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe980bd7-bc5c-4308-ba3c-a264ab70b88d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.712073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fe980bd7-bc5c-4308-ba3c-a264ab70b88d","Type":"ContainerDied","Data":"3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04"} Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.712114 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3811b07ee51e95c4a0102895148031c299cc01059e37b8ca53795cf2ec532c04" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.712163 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.716960 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f3afda6-923c-403a-994d-996da0ad0fee" containerID="1b9deeba3ca0056c2b95e510afaddba5c76a8abf7b5acaf54d959db06f192793" exitCode=0 Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.717034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-5687l" event={"ID":"1f3afda6-923c-403a-994d-996da0ad0fee","Type":"ContainerDied","Data":"1b9deeba3ca0056c2b95e510afaddba5c76a8abf7b5acaf54d959db06f192793"} Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.718368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"680186dc-8ec9-4306-aeda-d0bea7fa9a08","Type":"ContainerDied","Data":"55181352f2bc152630b552a4537ee19433026c76eae638e630f6d78e26014392"} Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.718445 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55181352f2bc152630b552a4537ee19433026c76eae638e630f6d78e26014392" Mar 20 14:55:36 crc kubenswrapper[4764]: I0320 14:55:36.718492 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.017126 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-26 22:54:59.954676873 +0000 UTC Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.017171 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6751h59m22.93750953s for next certificate rotation Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.568449 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 14:55:37 crc kubenswrapper[4764]: E0320 14:55:37.569194 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680186dc-8ec9-4306-aeda-d0bea7fa9a08" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.569204 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="680186dc-8ec9-4306-aeda-d0bea7fa9a08" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: E0320 14:55:37.569213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe980bd7-bc5c-4308-ba3c-a264ab70b88d" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.569219 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe980bd7-bc5c-4308-ba3c-a264ab70b88d" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.569319 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe980bd7-bc5c-4308-ba3c-a264ab70b88d" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.569330 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="680186dc-8ec9-4306-aeda-d0bea7fa9a08" containerName="pruner" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.569680 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.576610 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.604023 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.604362 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.604771 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.604809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.706523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.706665 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.707051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.727058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:37 crc kubenswrapper[4764]: I0320 14:55:37.927104 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.015407 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-5687l" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.018292 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-16 22:56:48.901383245 +0000 UTC Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.018316 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6512h1m10.883068946s for next certificate rotation Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.111652 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-486kc\" (UniqueName: \"kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc\") pod \"1f3afda6-923c-403a-994d-996da0ad0fee\" (UID: \"1f3afda6-923c-403a-994d-996da0ad0fee\") " Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.116445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc" (OuterVolumeSpecName: "kube-api-access-486kc") pod "1f3afda6-923c-403a-994d-996da0ad0fee" (UID: "1f3afda6-923c-403a-994d-996da0ad0fee"). InnerVolumeSpecName "kube-api-access-486kc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.213072 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-486kc\" (UniqueName: \"kubernetes.io/projected/1f3afda6-923c-403a-994d-996da0ad0fee-kube-api-access-486kc\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.342188 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.443848 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.443902 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.754890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aad2eb4-a84b-41f9-88bf-c306f3735b78","Type":"ContainerStarted","Data":"03556fdd569626c51f057e15533f343ba2ba651bacd84e135919ff9fba579e2c"} Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.755270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aad2eb4-a84b-41f9-88bf-c306f3735b78","Type":"ContainerStarted","Data":"df4ca9445031c6c4dc1e4dbde1fc55f60bce7aca2ef4d435c13cb48b1d3a8625"} Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.759510 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-5687l" event={"ID":"1f3afda6-923c-403a-994d-996da0ad0fee","Type":"ContainerDied","Data":"18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004"} Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.759537 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18a0e0fd711d12be5c8a1d59575d9c152e3271d36a5e4cfe99295b45d930f004" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.759708 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-5687l" Mar 20 14:55:38 crc kubenswrapper[4764]: I0320 14:55:38.778659 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.778638998 podStartE2EDuration="1.778638998s" podCreationTimestamp="2026-03-20 14:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:38.769001137 +0000 UTC m=+260.385190266" watchObservedRunningTime="2026-03-20 14:55:38.778638998 +0000 UTC m=+260.394828117" Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.095424 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.095615 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" podUID="4b584f21-f79c-4bd6-937e-becba3cc91e6" containerName="controller-manager" containerID="cri-o://976f98778422af3918d1d9a7060837e2e77955515bc40e062dc04cb0b24b3d94" gracePeriod=30 Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.212961 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.213645 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerName="route-controller-manager" containerID="cri-o://e55e344eacddcb4407d46a6d4f9ff8b3b0d7a24b57932227e6bf0a3ce85d5b13" gracePeriod=30 Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.770397 4764 generic.go:334] "Generic (PLEG): container finished" podID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerID="e55e344eacddcb4407d46a6d4f9ff8b3b0d7a24b57932227e6bf0a3ce85d5b13" exitCode=0 Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.770466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" event={"ID":"0099134f-b810-4235-8d81-a0dfba45f3b9","Type":"ContainerDied","Data":"e55e344eacddcb4407d46a6d4f9ff8b3b0d7a24b57932227e6bf0a3ce85d5b13"} Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.773158 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b584f21-f79c-4bd6-937e-becba3cc91e6" containerID="976f98778422af3918d1d9a7060837e2e77955515bc40e062dc04cb0b24b3d94" exitCode=0 Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.773219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" event={"ID":"4b584f21-f79c-4bd6-937e-becba3cc91e6","Type":"ContainerDied","Data":"976f98778422af3918d1d9a7060837e2e77955515bc40e062dc04cb0b24b3d94"} Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 14:55:39.774968 4764 generic.go:334] "Generic (PLEG): container finished" podID="0aad2eb4-a84b-41f9-88bf-c306f3735b78" containerID="03556fdd569626c51f057e15533f343ba2ba651bacd84e135919ff9fba579e2c" exitCode=0 Mar 20 14:55:39 crc kubenswrapper[4764]: I0320 
14:55:39.774996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aad2eb4-a84b-41f9-88bf-c306f3735b78","Type":"ContainerDied","Data":"03556fdd569626c51f057e15533f343ba2ba651bacd84e135919ff9fba579e2c"} Mar 20 14:55:42 crc kubenswrapper[4764]: I0320 14:55:42.391085 4764 patch_prober.go:28] interesting pod/route-controller-manager-7698bbdc48-bmtzv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:55:42 crc kubenswrapper[4764]: I0320 14:55:42.391575 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.456953 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"] Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.657191 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.657231 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.786720 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.792109 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.810263 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.810767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0aad2eb4-a84b-41f9-88bf-c306f3735b78","Type":"ContainerDied","Data":"df4ca9445031c6c4dc1e4dbde1fc55f60bce7aca2ef4d435c13cb48b1d3a8625"} Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.810805 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4ca9445031c6c4dc1e4dbde1fc55f60bce7aca2ef4d435c13cb48b1d3a8625" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.810859 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.812704 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.812881 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv" event={"ID":"0099134f-b810-4235-8d81-a0dfba45f3b9","Type":"ContainerDied","Data":"0046369694da55adc0651d58c598129e6ee5eb2fa2c021d10de0df248487c697"} Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.812923 4764 scope.go:117] "RemoveContainer" containerID="e55e344eacddcb4407d46a6d4f9ff8b3b0d7a24b57932227e6bf0a3ce85d5b13" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.815742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" event={"ID":"4b584f21-f79c-4bd6-937e-becba3cc91e6","Type":"ContainerDied","Data":"457fd46deefd1e119c6a0b5fd3a5218f8f8525bdf5a51852fc7d427218dd8d4a"} Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.815810 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bdb574b79-7vfgz" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.843959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.860273 4764 scope.go:117] "RemoveContainer" containerID="976f98778422af3918d1d9a7060837e2e77955515bc40e062dc04cb0b24b3d94" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.882597 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert\") pod \"4b584f21-f79c-4bd6-937e-becba3cc91e6\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access\") pod \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898064 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles\") pod \"4b584f21-f79c-4bd6-937e-becba3cc91e6\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config\") pod 
\"4b584f21-f79c-4bd6-937e-becba3cc91e6\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca\") pod \"0099134f-b810-4235-8d81-a0dfba45f3b9\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config\") pod \"0099134f-b810-4235-8d81-a0dfba45f3b9\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir\") pod \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\" (UID: \"0aad2eb4-a84b-41f9-88bf-c306f3735b78\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert\") pod \"0099134f-b810-4235-8d81-a0dfba45f3b9\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gq8w\" (UniqueName: \"kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w\") pod \"4b584f21-f79c-4bd6-937e-becba3cc91e6\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sm9bw\" (UniqueName: \"kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw\") pod \"0099134f-b810-4235-8d81-a0dfba45f3b9\" (UID: \"0099134f-b810-4235-8d81-a0dfba45f3b9\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca\") pod \"4b584f21-f79c-4bd6-937e-becba3cc91e6\" (UID: \"4b584f21-f79c-4bd6-937e-becba3cc91e6\") " Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0aad2eb4-a84b-41f9-88bf-c306f3735b78" (UID: "0aad2eb4-a84b-41f9-88bf-c306f3735b78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898482 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.898922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca" (OuterVolumeSpecName: "client-ca") pod "0099134f-b810-4235-8d81-a0dfba45f3b9" (UID: "0099134f-b810-4235-8d81-a0dfba45f3b9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.899002 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config" (OuterVolumeSpecName: "config") pod "0099134f-b810-4235-8d81-a0dfba45f3b9" (UID: "0099134f-b810-4235-8d81-a0dfba45f3b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.899771 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config" (OuterVolumeSpecName: "config") pod "4b584f21-f79c-4bd6-937e-becba3cc91e6" (UID: "4b584f21-f79c-4bd6-937e-becba3cc91e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.900274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4b584f21-f79c-4bd6-937e-becba3cc91e6" (UID: "4b584f21-f79c-4bd6-937e-becba3cc91e6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.900470 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b584f21-f79c-4bd6-937e-becba3cc91e6" (UID: "4b584f21-f79c-4bd6-937e-becba3cc91e6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.905624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0099134f-b810-4235-8d81-a0dfba45f3b9" (UID: "0099134f-b810-4235-8d81-a0dfba45f3b9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.905643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0aad2eb4-a84b-41f9-88bf-c306f3735b78" (UID: "0aad2eb4-a84b-41f9-88bf-c306f3735b78"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.905720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w" (OuterVolumeSpecName: "kube-api-access-4gq8w") pod "4b584f21-f79c-4bd6-937e-becba3cc91e6" (UID: "4b584f21-f79c-4bd6-937e-becba3cc91e6"). InnerVolumeSpecName "kube-api-access-4gq8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.906623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b584f21-f79c-4bd6-937e-becba3cc91e6" (UID: "4b584f21-f79c-4bd6-937e-becba3cc91e6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.918505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw" (OuterVolumeSpecName: "kube-api-access-sm9bw") pod "0099134f-b810-4235-8d81-a0dfba45f3b9" (UID: "0099134f-b810-4235-8d81-a0dfba45f3b9"). InnerVolumeSpecName "kube-api-access-sm9bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999252 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm9bw\" (UniqueName: \"kubernetes.io/projected/0099134f-b810-4235-8d81-a0dfba45f3b9-kube-api-access-sm9bw\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999284 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999295 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b584f21-f79c-4bd6-937e-becba3cc91e6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999304 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0aad2eb4-a84b-41f9-88bf-c306f3735b78-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999314 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999323 4764 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/4b584f21-f79c-4bd6-937e-becba3cc91e6-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999334 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999342 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0099134f-b810-4235-8d81-a0dfba45f3b9-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999350 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0099134f-b810-4235-8d81-a0dfba45f3b9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:43 crc kubenswrapper[4764]: I0320 14:55:43.999359 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gq8w\" (UniqueName: \"kubernetes.io/projected/4b584f21-f79c-4bd6-937e-becba3cc91e6-kube-api-access-4gq8w\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.076764 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.159263 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.163338 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7698bbdc48-bmtzv"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.170586 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 
14:55:44.173978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bdb574b79-7vfgz"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360445 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 14:55:44 crc kubenswrapper[4764]: E0320 14:55:44.360630 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerName="route-controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360641 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerName="route-controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: E0320 14:55:44.360652 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aad2eb4-a84b-41f9-88bf-c306f3735b78" containerName="pruner" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360658 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aad2eb4-a84b-41f9-88bf-c306f3735b78" containerName="pruner" Mar 20 14:55:44 crc kubenswrapper[4764]: E0320 14:55:44.360666 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3afda6-923c-403a-994d-996da0ad0fee" containerName="oc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360671 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3afda6-923c-403a-994d-996da0ad0fee" containerName="oc" Mar 20 14:55:44 crc kubenswrapper[4764]: E0320 14:55:44.360686 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b584f21-f79c-4bd6-937e-becba3cc91e6" containerName="controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360692 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b584f21-f79c-4bd6-937e-becba3cc91e6" containerName="controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360778 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f3afda6-923c-403a-994d-996da0ad0fee" containerName="oc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360792 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b584f21-f79c-4bd6-937e-becba3cc91e6" containerName="controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360799 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aad2eb4-a84b-41f9-88bf-c306f3735b78" containerName="pruner" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.360812 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" containerName="route-controller-manager" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.361130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.363972 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.366288 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.386244 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.410044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.410089 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.410172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.511196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.511239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.511261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.511322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.511358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.529881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access\") pod \"installer-9-crc\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.677260 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.823542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerStarted","Data":"87888cbf290e8781fd7ca3233608cf4f80726b26ac2407ed8728ef03c2f93090"} Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.827499 4764 generic.go:334] "Generic (PLEG): container finished" podID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerID="bc37799e99e3de244e086b25fdcd4eea98c6126a5f743861795ec69ea250e970" exitCode=0 Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.827556 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerDied","Data":"bc37799e99e3de244e086b25fdcd4eea98c6126a5f743861795ec69ea250e970"} Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.834648 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="4a051746-92b7-4a16-a641-d73888dcfcca" containerID="578fa9430a1ab0030c466710bf7cc43c73d5be74517d9d892f7b1a0e9487eeda" exitCode=0 Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.834894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerDied","Data":"578fa9430a1ab0030c466710bf7cc43c73d5be74517d9d892f7b1a0e9487eeda"} Mar 20 14:55:44 crc kubenswrapper[4764]: I0320 14:55:44.868967 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.136302 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0099134f-b810-4235-8d81-a0dfba45f3b9" path="/var/lib/kubelet/pods/0099134f-b810-4235-8d81-a0dfba45f3b9/volumes" Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.137971 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b584f21-f79c-4bd6-937e-becba3cc91e6" path="/var/lib/kubelet/pods/4b584f21-f79c-4bd6-937e-becba3cc91e6/volumes" Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.846096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d4da6599-c529-49c6-a409-e6abdec42a79","Type":"ContainerStarted","Data":"aaf70e68004ef94b046b3ba3c602025d765d74d331634e98a1fae16ba277966b"} Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.846435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d4da6599-c529-49c6-a409-e6abdec42a79","Type":"ContainerStarted","Data":"5b5da596ef192c47caedb9a9e301083fdcb5ea10306b10150ba2943b470b4eba"} Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.849783 4764 generic.go:334] "Generic (PLEG): container finished" podID="548554c3-21d2-4406-a509-e80303628f56" 
containerID="87888cbf290e8781fd7ca3233608cf4f80726b26ac2407ed8728ef03c2f93090" exitCode=0 Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.849881 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerDied","Data":"87888cbf290e8781fd7ca3233608cf4f80726b26ac2407ed8728ef03c2f93090"} Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.853408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerStarted","Data":"81b922502c72275a6f64c96c7489d36e5732e8449227993f72155b679f50a035"} Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.856229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerStarted","Data":"3942a276f8cb67ec8c9ec580140afcfecb1fe16ad139d8551771d3167d0896f5"} Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.856374 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tmdd7" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="registry-server" containerID="cri-o://c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce" gracePeriod=2 Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.897695 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.89767582 podStartE2EDuration="1.89767582s" podCreationTimestamp="2026-03-20 14:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:45.870980279 +0000 UTC m=+267.487169408" watchObservedRunningTime="2026-03-20 14:55:45.89767582 +0000 UTC m=+267.513864949" Mar 20 
14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.898169 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4llk9" podStartSLOduration=30.058017842 podStartE2EDuration="41.898164808s" podCreationTimestamp="2026-03-20 14:55:04 +0000 UTC" firstStartedPulling="2026-03-20 14:55:33.650600094 +0000 UTC m=+255.266789223" lastFinishedPulling="2026-03-20 14:55:45.490747 +0000 UTC m=+267.106936189" observedRunningTime="2026-03-20 14:55:45.894295299 +0000 UTC m=+267.510484428" watchObservedRunningTime="2026-03-20 14:55:45.898164808 +0000 UTC m=+267.514353937" Mar 20 14:55:45 crc kubenswrapper[4764]: I0320 14:55:45.929114 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fzfpf" podStartSLOduration=31.024526716 podStartE2EDuration="42.92909916s" podCreationTimestamp="2026-03-20 14:55:03 +0000 UTC" firstStartedPulling="2026-03-20 14:55:33.657417584 +0000 UTC m=+255.273606723" lastFinishedPulling="2026-03-20 14:55:45.561990038 +0000 UTC m=+267.178179167" observedRunningTime="2026-03-20 14:55:45.925814903 +0000 UTC m=+267.542004062" watchObservedRunningTime="2026-03-20 14:55:45.92909916 +0000 UTC m=+267.545288289" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.205291 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.244406 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities\") pod \"a8776b7a-9a4d-41a5-a022-701f97953a5f\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.244505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpbp8\" (UniqueName: \"kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8\") pod \"a8776b7a-9a4d-41a5-a022-701f97953a5f\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.244533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content\") pod \"a8776b7a-9a4d-41a5-a022-701f97953a5f\" (UID: \"a8776b7a-9a4d-41a5-a022-701f97953a5f\") " Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.249495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8" (OuterVolumeSpecName: "kube-api-access-lpbp8") pod "a8776b7a-9a4d-41a5-a022-701f97953a5f" (UID: "a8776b7a-9a4d-41a5-a022-701f97953a5f"). InnerVolumeSpecName "kube-api-access-lpbp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.260285 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities" (OuterVolumeSpecName: "utilities") pod "a8776b7a-9a4d-41a5-a022-701f97953a5f" (UID: "a8776b7a-9a4d-41a5-a022-701f97953a5f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.279248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8776b7a-9a4d-41a5-a022-701f97953a5f" (UID: "a8776b7a-9a4d-41a5-a022-701f97953a5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.345649 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.345687 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpbp8\" (UniqueName: \"kubernetes.io/projected/a8776b7a-9a4d-41a5-a022-701f97953a5f-kube-api-access-lpbp8\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.345717 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8776b7a-9a4d-41a5-a022-701f97953a5f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.588613 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.588877 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="extract-utilities" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.588892 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="extract-utilities" Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.588915 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="extract-content" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.588923 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="extract-content" Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.588935 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="registry-server" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.588940 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="registry-server" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.589039 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerName="registry-server" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.589467 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.592434 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.592931 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.593201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.593461 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.593467 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.593533 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.594017 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.598758 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.598789 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.598871 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.599017 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.599903 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.599927 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.602921 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.603719 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.605623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.613713 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.649531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.649616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.649729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldbrl\" (UniqueName: \"kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.649786 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9k6w\" (UniqueName: \"kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: 
\"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.649808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.650099 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.650198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.650265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.650318 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldbrl\" (UniqueName: \"kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9k6w\" (UniqueName: \"kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert\") pod \"controller-manager-66957cb65f-vr58f\" (UID: 
\"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.752576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.752827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.751578 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.752900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.752918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.752965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.753027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.754178 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.754964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.756807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.768039 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldbrl\" (UniqueName: \"kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl\") pod \"controller-manager-66957cb65f-vr58f\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.771031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9k6w\" (UniqueName: \"kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w\") pod \"route-controller-manager-58cf65c484-zw2jz\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " 
pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.862728 4764 generic.go:334] "Generic (PLEG): container finished" podID="67e76e77-4199-4fdd-b755-10cab62e1370" containerID="0fd969413f12cbe838fb83565a38f1f771fdbf84968956688e03cf62ebf643be" exitCode=0 Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.862814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerDied","Data":"0fd969413f12cbe838fb83565a38f1f771fdbf84968956688e03cf62ebf643be"} Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.864333 4764 generic.go:334] "Generic (PLEG): container finished" podID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerID="bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283" exitCode=0 Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.864474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerDied","Data":"bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283"} Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.867248 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8776b7a-9a4d-41a5-a022-701f97953a5f" containerID="c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce" exitCode=0 Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.867302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerDied","Data":"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce"} Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.867321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tmdd7" 
event={"ID":"a8776b7a-9a4d-41a5-a022-701f97953a5f","Type":"ContainerDied","Data":"c69e6c4bd6f02410db897a0f6478f21fbfa40a6b937491d96f2e17c15df6d02b"} Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.867322 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tmdd7" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.867337 4764 scope.go:117] "RemoveContainer" containerID="c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.873726 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerStarted","Data":"59150e1e0d63536c0f51e2f90ac24585b123050217841fefc6b6cebc3b3e6c70"} Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.881464 4764 scope.go:117] "RemoveContainer" containerID="e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.910313 4764 scope.go:117] "RemoveContainer" containerID="095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.914093 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.925918 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7m8j" podStartSLOduration=2.960057147 podStartE2EDuration="46.92590036s" podCreationTimestamp="2026-03-20 14:55:00 +0000 UTC" firstStartedPulling="2026-03-20 14:55:02.337514983 +0000 UTC m=+223.953704112" lastFinishedPulling="2026-03-20 14:55:46.303358196 +0000 UTC m=+267.919547325" observedRunningTime="2026-03-20 14:55:46.905756552 +0000 UTC m=+268.521945691" watchObservedRunningTime="2026-03-20 14:55:46.92590036 +0000 UTC m=+268.542089499" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.933836 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.934998 4764 scope.go:117] "RemoveContainer" containerID="c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce" Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.943683 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce\": container with ID starting with c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce not found: ID does not exist" containerID="c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.943720 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce"} err="failed to get container status \"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce\": rpc error: code = NotFound desc = could not find container 
\"c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce\": container with ID starting with c84a0aa69150068de769b606ea7bb235810aeb363802c6b7a001d9b1c8f7d8ce not found: ID does not exist" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.943744 4764 scope.go:117] "RemoveContainer" containerID="e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6" Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.944159 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6\": container with ID starting with e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6 not found: ID does not exist" containerID="e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.944181 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6"} err="failed to get container status \"e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6\": rpc error: code = NotFound desc = could not find container \"e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6\": container with ID starting with e70a4d5ec8cb22a044089c851cd83dbadc2b34fac3f6226b99a465a98a2ca5b6 not found: ID does not exist" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.944196 4764 scope.go:117] "RemoveContainer" containerID="095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87" Mar 20 14:55:46 crc kubenswrapper[4764]: E0320 14:55:46.944723 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87\": container with ID starting with 095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87 not found: ID does not exist" 
containerID="095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.944743 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87"} err="failed to get container status \"095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87\": rpc error: code = NotFound desc = could not find container \"095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87\": container with ID starting with 095b8cb1e49d14aa303847414ed9492cbbf9c54e57840491ff9ca256303c7f87 not found: ID does not exist" Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.949439 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 14:55:46 crc kubenswrapper[4764]: I0320 14:55:46.951511 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tmdd7"] Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.134989 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8776b7a-9a4d-41a5-a022-701f97953a5f" path="/var/lib/kubelet/pods/a8776b7a-9a4d-41a5-a022-701f97953a5f/volumes" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.179979 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.222294 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.880349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerStarted","Data":"e90443574f4f719172f02718115464a18cce41db48c9e44c1aa44193f1c468a8"} Mar 20 14:55:47 
crc kubenswrapper[4764]: I0320 14:55:47.882308 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerStarted","Data":"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.885246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerStarted","Data":"ca75eb84c7a50dfc81576a068eb0eaea7a16ab0d5b41cd44978dc3eb392e4d53"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.886372 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" event={"ID":"b6aa925f-5516-4cba-82e6-098769e2d405","Type":"ContainerStarted","Data":"815f86b1344910e974aaf5cda864d91105d9ed7454d8cee0988e8a70b8e8ca6c"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.886418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" event={"ID":"b6aa925f-5516-4cba-82e6-098769e2d405","Type":"ContainerStarted","Data":"300eb1a08ab9f9f25440a4e4f95b19047b7bd43c93d72aae0ca0f4182252db63"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.886591 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.887536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" event={"ID":"5518b1eb-a8ba-4239-ab7f-a4426779168d","Type":"ContainerStarted","Data":"512fb6f5954812756852b19f8551094aec6f7b66e70916ef6dd946fb8449a09f"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.887559 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" event={"ID":"5518b1eb-a8ba-4239-ab7f-a4426779168d","Type":"ContainerStarted","Data":"1242e17870c44166327c2c10e367efec7be339cdc3f22fdaa7400cabe57945ce"} Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.887704 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.893861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.900079 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cmxv2" podStartSLOduration=2.859967265 podStartE2EDuration="47.900064784s" podCreationTimestamp="2026-03-20 14:55:00 +0000 UTC" firstStartedPulling="2026-03-20 14:55:02.319261587 +0000 UTC m=+223.935450716" lastFinishedPulling="2026-03-20 14:55:47.359359106 +0000 UTC m=+268.975548235" observedRunningTime="2026-03-20 14:55:47.898324432 +0000 UTC m=+269.514513561" watchObservedRunningTime="2026-03-20 14:55:47.900064784 +0000 UTC m=+269.516253913" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.921000 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" podStartSLOduration=8.920986769 podStartE2EDuration="8.920986769s" podCreationTimestamp="2026-03-20 14:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:47.917405552 +0000 UTC m=+269.533594681" watchObservedRunningTime="2026-03-20 14:55:47.920986769 +0000 UTC m=+269.537175898" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.937287 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-nphvv" podStartSLOduration=1.962400845 podStartE2EDuration="46.93727145s" podCreationTimestamp="2026-03-20 14:55:01 +0000 UTC" firstStartedPulling="2026-03-20 14:55:02.318914344 +0000 UTC m=+223.935103473" lastFinishedPulling="2026-03-20 14:55:47.293784949 +0000 UTC m=+268.909974078" observedRunningTime="2026-03-20 14:55:47.934159899 +0000 UTC m=+269.550349028" watchObservedRunningTime="2026-03-20 14:55:47.93727145 +0000 UTC m=+269.553460579" Mar 20 14:55:47 crc kubenswrapper[4764]: I0320 14:55:47.954135 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" podStartSLOduration=8.95411491 podStartE2EDuration="8.95411491s" podCreationTimestamp="2026-03-20 14:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:55:47.952750951 +0000 UTC m=+269.568940080" watchObservedRunningTime="2026-03-20 14:55:47.95411491 +0000 UTC m=+269.570304039" Mar 20 14:55:48 crc kubenswrapper[4764]: I0320 14:55:48.120711 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:55:48 crc kubenswrapper[4764]: I0320 14:55:48.895794 4764 generic.go:334] "Generic (PLEG): container finished" podID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerID="ca75eb84c7a50dfc81576a068eb0eaea7a16ab0d5b41cd44978dc3eb392e4d53" exitCode=0 Mar 20 14:55:48 crc kubenswrapper[4764]: I0320 14:55:48.895893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerDied","Data":"ca75eb84c7a50dfc81576a068eb0eaea7a16ab0d5b41cd44978dc3eb392e4d53"} Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.118116 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.118436 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.168755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.322422 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.322494 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.365845 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.809453 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.809520 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.853286 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:55:51 crc kubenswrapper[4764]: I0320 14:55:51.952100 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:55:52 crc kubenswrapper[4764]: I0320 14:55:52.919481 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" 
event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerStarted","Data":"5a074b1406ab1ac382f7073b289cc8c6fe55630339d2f797bfa5c15530b7afc5"} Mar 20 14:55:52 crc kubenswrapper[4764]: I0320 14:55:52.921477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerStarted","Data":"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4"} Mar 20 14:55:52 crc kubenswrapper[4764]: I0320 14:55:52.944809 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44fr6" podStartSLOduration=9.887780132 podStartE2EDuration="50.944786719s" podCreationTimestamp="2026-03-20 14:55:02 +0000 UTC" firstStartedPulling="2026-03-20 14:55:11.24450517 +0000 UTC m=+232.860694299" lastFinishedPulling="2026-03-20 14:55:52.301511747 +0000 UTC m=+273.917700886" observedRunningTime="2026-03-20 14:55:52.941312435 +0000 UTC m=+274.557501554" watchObservedRunningTime="2026-03-20 14:55:52.944786719 +0000 UTC m=+274.560975878" Mar 20 14:55:53 crc kubenswrapper[4764]: I0320 14:55:53.317280 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:53 crc kubenswrapper[4764]: I0320 14:55:53.317670 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:55:53 crc kubenswrapper[4764]: I0320 14:55:53.927078 4764 generic.go:334] "Generic (PLEG): container finished" podID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerID="be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4" exitCode=0 Mar 20 14:55:53 crc kubenswrapper[4764]: I0320 14:55:53.927770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" 
event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerDied","Data":"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4"} Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.331689 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.331951 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.355331 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-44fr6" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="registry-server" probeResult="failure" output=< Mar 20 14:55:54 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 14:55:54 crc kubenswrapper[4764]: > Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.376348 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.681429 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.681468 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.736452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:54 crc kubenswrapper[4764]: I0320 14:55:54.969334 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:55 crc kubenswrapper[4764]: I0320 14:55:55.412839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:55:57 crc kubenswrapper[4764]: I0320 14:55:55.939993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerStarted","Data":"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6"} Mar 20 14:55:57 crc kubenswrapper[4764]: I0320 14:55:55.961158 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvnpx" podStartSLOduration=1.7339757709999999 podStartE2EDuration="54.961139886s" podCreationTimestamp="2026-03-20 14:55:01 +0000 UTC" firstStartedPulling="2026-03-20 14:55:02.33941993 +0000 UTC m=+223.955609059" lastFinishedPulling="2026-03-20 14:55:55.566584045 +0000 UTC m=+277.182773174" observedRunningTime="2026-03-20 14:55:55.958844533 +0000 UTC m=+277.575033662" watchObservedRunningTime="2026-03-20 14:55:55.961139886 +0000 UTC m=+277.577329005" Mar 20 14:55:58 crc kubenswrapper[4764]: I0320 14:55:58.071902 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:55:58 crc kubenswrapper[4764]: I0320 14:55:58.072107 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4llk9" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="registry-server" containerID="cri-o://81b922502c72275a6f64c96c7489d36e5732e8449227993f72155b679f50a035" gracePeriod=2 Mar 20 14:55:58 crc kubenswrapper[4764]: I0320 14:55:58.972256 4764 generic.go:334] "Generic (PLEG): container finished" podID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerID="81b922502c72275a6f64c96c7489d36e5732e8449227993f72155b679f50a035" exitCode=0 Mar 20 14:55:58 crc kubenswrapper[4764]: I0320 14:55:58.972368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" 
event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerDied","Data":"81b922502c72275a6f64c96c7489d36e5732e8449227993f72155b679f50a035"} Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.112565 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.112816 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" podUID="5518b1eb-a8ba-4239-ab7f-a4426779168d" containerName="controller-manager" containerID="cri-o://512fb6f5954812756852b19f8551094aec6f7b66e70916ef6dd946fb8449a09f" gracePeriod=30 Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.120848 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.121416 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" podUID="b6aa925f-5516-4cba-82e6-098769e2d405" containerName="route-controller-manager" containerID="cri-o://815f86b1344910e974aaf5cda864d91105d9ed7454d8cee0988e8a70b8e8ca6c" gracePeriod=30 Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.724259 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.742508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content\") pod \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.742671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities\") pod \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.742761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72pk\" (UniqueName: \"kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk\") pod \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\" (UID: \"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b\") " Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.743830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities" (OuterVolumeSpecName: "utilities") pod "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" (UID: "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.744749 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.750399 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk" (OuterVolumeSpecName: "kube-api-access-n72pk") pod "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" (UID: "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b"). InnerVolumeSpecName "kube-api-access-n72pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.846274 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72pk\" (UniqueName: \"kubernetes.io/projected/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-kube-api-access-n72pk\") on node \"crc\" DevicePath \"\"" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.979827 4764 generic.go:334] "Generic (PLEG): container finished" podID="b6aa925f-5516-4cba-82e6-098769e2d405" containerID="815f86b1344910e974aaf5cda864d91105d9ed7454d8cee0988e8a70b8e8ca6c" exitCode=0 Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.979908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" event={"ID":"b6aa925f-5516-4cba-82e6-098769e2d405","Type":"ContainerDied","Data":"815f86b1344910e974aaf5cda864d91105d9ed7454d8cee0988e8a70b8e8ca6c"} Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.981538 4764 generic.go:334] "Generic (PLEG): container finished" podID="5518b1eb-a8ba-4239-ab7f-a4426779168d" containerID="512fb6f5954812756852b19f8551094aec6f7b66e70916ef6dd946fb8449a09f" exitCode=0 Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.981607 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" event={"ID":"5518b1eb-a8ba-4239-ab7f-a4426779168d","Type":"ContainerDied","Data":"512fb6f5954812756852b19f8551094aec6f7b66e70916ef6dd946fb8449a09f"} Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.983821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4llk9" event={"ID":"f21ed6b8-e9f9-4d40-8700-77c6b3919a4b","Type":"ContainerDied","Data":"e0b342f4464ee6f811f4f7ded429693f3eba70521e85b27e91beca253282af2b"} Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.983858 4764 scope.go:117] "RemoveContainer" containerID="81b922502c72275a6f64c96c7489d36e5732e8449227993f72155b679f50a035" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.983887 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4llk9" Mar 20 14:55:59 crc kubenswrapper[4764]: I0320 14:55:59.997799 4764 scope.go:117] "RemoveContainer" containerID="bc37799e99e3de244e086b25fdcd4eea98c6126a5f743861795ec69ea250e970" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.013357 4764 scope.go:117] "RemoveContainer" containerID="2bdc0a2a551c8d22b98cfe9b4eb4bb1bfd209dd40d6a6860c7e233b9449a4d02" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.114310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" (UID: "f21ed6b8-e9f9-4d40-8700-77c6b3919a4b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140217 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566976-h6xpn"] Mar 20 14:56:00 crc kubenswrapper[4764]: E0320 14:56:00.140439 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="extract-utilities" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140450 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="extract-utilities" Mar 20 14:56:00 crc kubenswrapper[4764]: E0320 14:56:00.140463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="extract-content" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140469 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="extract-content" Mar 20 14:56:00 crc kubenswrapper[4764]: E0320 14:56:00.140484 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140490 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140577 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.140910 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.143067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.143609 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.144061 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.149407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzwj\" (UniqueName: \"kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj\") pod \"auto-csr-approver-29566976-h6xpn\" (UID: \"190d359f-af45-40db-a90e-91bc465e6e1f\") " pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.149442 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-h6xpn"] Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.149517 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.250398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzwj\" (UniqueName: \"kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj\") pod \"auto-csr-approver-29566976-h6xpn\" (UID: \"190d359f-af45-40db-a90e-91bc465e6e1f\") " pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.272936 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzwj\" (UniqueName: \"kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj\") pod \"auto-csr-approver-29566976-h6xpn\" (UID: \"190d359f-af45-40db-a90e-91bc465e6e1f\") " pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.312049 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.320829 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4llk9"] Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.486868 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.947729 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-h6xpn"] Mar 20 14:56:00 crc kubenswrapper[4764]: I0320 14:56:00.996466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" event={"ID":"190d359f-af45-40db-a90e-91bc465e6e1f","Type":"ContainerStarted","Data":"3dde5c03c1d2e10c1815cdf397342a1e721e8af02d514a7036ed3ebdf230a460"} Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.132569 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.136203 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21ed6b8-e9f9-4d40-8700-77c6b3919a4b" path="/var/lib/kubelet/pods/f21ed6b8-e9f9-4d40-8700-77c6b3919a4b/volumes" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.145196 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.160907 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca\") pod \"b6aa925f-5516-4cba-82e6-098769e2d405\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.160970 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config\") pod \"b6aa925f-5516-4cba-82e6-098769e2d405\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161034 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldbrl\" (UniqueName: \"kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl\") pod \"5518b1eb-a8ba-4239-ab7f-a4426779168d\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161094 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca\") pod \"5518b1eb-a8ba-4239-ab7f-a4426779168d\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9k6w\" (UniqueName: \"kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w\") pod \"b6aa925f-5516-4cba-82e6-098769e2d405\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161156 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert\") pod \"b6aa925f-5516-4cba-82e6-098769e2d405\" (UID: \"b6aa925f-5516-4cba-82e6-098769e2d405\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert\") pod \"5518b1eb-a8ba-4239-ab7f-a4426779168d\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161232 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles\") pod \"5518b1eb-a8ba-4239-ab7f-a4426779168d\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config\") pod \"5518b1eb-a8ba-4239-ab7f-a4426779168d\" (UID: \"5518b1eb-a8ba-4239-ab7f-a4426779168d\") " Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.161595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca" (OuterVolumeSpecName: "client-ca") pod "b6aa925f-5516-4cba-82e6-098769e2d405" (UID: "b6aa925f-5516-4cba-82e6-098769e2d405"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.162600 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca" (OuterVolumeSpecName: "client-ca") pod "5518b1eb-a8ba-4239-ab7f-a4426779168d" (UID: "5518b1eb-a8ba-4239-ab7f-a4426779168d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.162642 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config" (OuterVolumeSpecName: "config") pod "5518b1eb-a8ba-4239-ab7f-a4426779168d" (UID: "5518b1eb-a8ba-4239-ab7f-a4426779168d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.163024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config" (OuterVolumeSpecName: "config") pod "b6aa925f-5516-4cba-82e6-098769e2d405" (UID: "b6aa925f-5516-4cba-82e6-098769e2d405"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.168656 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w" (OuterVolumeSpecName: "kube-api-access-h9k6w") pod "b6aa925f-5516-4cba-82e6-098769e2d405" (UID: "b6aa925f-5516-4cba-82e6-098769e2d405"). InnerVolumeSpecName "kube-api-access-h9k6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.173129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5518b1eb-a8ba-4239-ab7f-a4426779168d" (UID: "5518b1eb-a8ba-4239-ab7f-a4426779168d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.177592 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b6aa925f-5516-4cba-82e6-098769e2d405" (UID: "b6aa925f-5516-4cba-82e6-098769e2d405"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.178522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5518b1eb-a8ba-4239-ab7f-a4426779168d" (UID: "5518b1eb-a8ba-4239-ab7f-a4426779168d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.180309 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl" (OuterVolumeSpecName: "kube-api-access-ldbrl") pod "5518b1eb-a8ba-4239-ab7f-a4426779168d" (UID: "5518b1eb-a8ba-4239-ab7f-a4426779168d"). InnerVolumeSpecName "kube-api-access-ldbrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.222652 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262836 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262862 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262873 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa925f-5516-4cba-82e6-098769e2d405-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262882 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldbrl\" (UniqueName: \"kubernetes.io/projected/5518b1eb-a8ba-4239-ab7f-a4426779168d-kube-api-access-ldbrl\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262890 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262899 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9k6w\" (UniqueName: \"kubernetes.io/projected/b6aa925f-5516-4cba-82e6-098769e2d405-kube-api-access-h9k6w\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262909 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b6aa925f-5516-4cba-82e6-098769e2d405-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262921 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5518b1eb-a8ba-4239-ab7f-a4426779168d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.262933 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5518b1eb-a8ba-4239-ab7f-a4426779168d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.488951 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.489030 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.562019 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:01 crc kubenswrapper[4764]: I0320 14:56:01.849735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.004294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" event={"ID":"5518b1eb-a8ba-4239-ab7f-a4426779168d","Type":"ContainerDied","Data":"1242e17870c44166327c2c10e367efec7be339cdc3f22fdaa7400cabe57945ce"} Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.004345 4764 scope.go:117] "RemoveContainer" containerID="512fb6f5954812756852b19f8551094aec6f7b66e70916ef6dd946fb8449a09f" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.004478 4764 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66957cb65f-vr58f" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.008236 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.009852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz" event={"ID":"b6aa925f-5516-4cba-82e6-098769e2d405","Type":"ContainerDied","Data":"300eb1a08ab9f9f25440a4e4f95b19047b7bd43c93d72aae0ca0f4182252db63"} Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.047470 4764 scope.go:117] "RemoveContainer" containerID="815f86b1344910e974aaf5cda864d91105d9ed7454d8cee0988e8a70b8e8ca6c" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.057342 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.063675 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.069505 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58cf65c484-zw2jz"] Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.094518 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:56:02 crc kubenswrapper[4764]: I0320 14:56:02.100865 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66957cb65f-vr58f"] Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.019771 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="190d359f-af45-40db-a90e-91bc465e6e1f" containerID="979d0deadee6841075097d4296a42d6e48dd0f01f8c29c7a9373ca540b992b84" exitCode=0 Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.019905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" event={"ID":"190d359f-af45-40db-a90e-91bc465e6e1f","Type":"ContainerDied","Data":"979d0deadee6841075097d4296a42d6e48dd0f01f8c29c7a9373ca540b992b84"} Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.137830 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5518b1eb-a8ba-4239-ab7f-a4426779168d" path="/var/lib/kubelet/pods/5518b1eb-a8ba-4239-ab7f-a4426779168d/volumes" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.139051 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6aa925f-5516-4cba-82e6-098769e2d405" path="/var/lib/kubelet/pods/b6aa925f-5516-4cba-82e6-098769e2d405/volumes" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.387772 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.465472 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.473206 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.604301 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:03 crc kubenswrapper[4764]: E0320 14:56:03.604876 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5518b1eb-a8ba-4239-ab7f-a4426779168d" containerName="controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.604909 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5518b1eb-a8ba-4239-ab7f-a4426779168d" containerName="controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: E0320 14:56:03.604934 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6aa925f-5516-4cba-82e6-098769e2d405" containerName="route-controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.604951 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6aa925f-5516-4cba-82e6-098769e2d405" containerName="route-controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.605217 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5518b1eb-a8ba-4239-ab7f-a4426779168d" containerName="controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.605255 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6aa925f-5516-4cba-82e6-098769e2d405" containerName="route-controller-manager" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.606106 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.608300 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.608929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.609488 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.609560 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.609878 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.610001 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.610016 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.613964 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.615204 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.616072 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.616079 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.616263 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.616244 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.616340 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.630979 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.635482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.642449 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.709963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpccp\" (UniqueName: \"kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710292 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710569 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.710596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gs84\" (UniqueName: \"kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: 
\"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812334 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpccp\" (UniqueName: \"kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812551 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812584 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gs84\" (UniqueName: \"kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.812613 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc 
kubenswrapper[4764]: I0320 14:56:03.813825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.814662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.815191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.815678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.816595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " 
pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.824290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.824526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.852349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpccp\" (UniqueName: \"kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp\") pod \"controller-manager-f6d86c7f7-l4gnw\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.863461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gs84\" (UniqueName: \"kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84\") pod \"route-controller-manager-7f6db9f9f4-qm4jz\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.952961 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:03 crc kubenswrapper[4764]: I0320 14:56:03.980843 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.030950 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvnpx" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="registry-server" containerID="cri-o://fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6" gracePeriod=2 Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.364615 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.406636 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.422633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqzwj\" (UniqueName: \"kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj\") pod \"190d359f-af45-40db-a90e-91bc465e6e1f\" (UID: \"190d359f-af45-40db-a90e-91bc465e6e1f\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.422955 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content\") pod \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.423077 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities\") pod \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.423186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjz2f\" (UniqueName: \"kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f\") pod \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\" (UID: \"6990bd44-c839-4dcd-bb4a-8d9da96bf644\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.424413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities" (OuterVolumeSpecName: "utilities") pod "6990bd44-c839-4dcd-bb4a-8d9da96bf644" (UID: "6990bd44-c839-4dcd-bb4a-8d9da96bf644"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.431170 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f" (OuterVolumeSpecName: "kube-api-access-tjz2f") pod "6990bd44-c839-4dcd-bb4a-8d9da96bf644" (UID: "6990bd44-c839-4dcd-bb4a-8d9da96bf644"). InnerVolumeSpecName "kube-api-access-tjz2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.431557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj" (OuterVolumeSpecName: "kube-api-access-bqzwj") pod "190d359f-af45-40db-a90e-91bc465e6e1f" (UID: "190d359f-af45-40db-a90e-91bc465e6e1f"). InnerVolumeSpecName "kube-api-access-bqzwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.446146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.479089 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.479515 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nphvv" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="registry-server" containerID="cri-o://adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534" gracePeriod=2 Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.502149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"6990bd44-c839-4dcd-bb4a-8d9da96bf644" (UID: "6990bd44-c839-4dcd-bb4a-8d9da96bf644"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.521029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.524356 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.524491 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6990bd44-c839-4dcd-bb4a-8d9da96bf644-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.524559 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjz2f\" (UniqueName: \"kubernetes.io/projected/6990bd44-c839-4dcd-bb4a-8d9da96bf644-kube-api-access-tjz2f\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.524617 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqzwj\" (UniqueName: \"kubernetes.io/projected/190d359f-af45-40db-a90e-91bc465e6e1f-kube-api-access-bqzwj\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.819621 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.930498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmzg8\" (UniqueName: \"kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8\") pod \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.930588 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content\") pod \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.930637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities\") pod \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\" (UID: \"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc\") " Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.931653 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities" (OuterVolumeSpecName: "utilities") pod "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" (UID: "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.935935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8" (OuterVolumeSpecName: "kube-api-access-hmzg8") pod "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" (UID: "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc"). InnerVolumeSpecName "kube-api-access-hmzg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:04 crc kubenswrapper[4764]: I0320 14:56:04.975717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" (UID: "429a82b0-5f61-4d42-a0d2-2fcb566f0bcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.031980 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.032010 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.032020 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmzg8\" (UniqueName: \"kubernetes.io/projected/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc-kube-api-access-hmzg8\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.037341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" event={"ID":"38a811ce-b082-41da-8661-4f9979d3f78d","Type":"ContainerStarted","Data":"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.037399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" event={"ID":"38a811ce-b082-41da-8661-4f9979d3f78d","Type":"ContainerStarted","Data":"8a2cd1b71aff03bc9ce9cce1d74107dac2da8e0633c6021c2d9a4babc9a607cc"} Mar 20 
14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.037418 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.038617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" event={"ID":"296d0cc6-b7eb-4d1f-a443-8d53147742ec","Type":"ContainerStarted","Data":"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.038657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" event={"ID":"296d0cc6-b7eb-4d1f-a443-8d53147742ec","Type":"ContainerStarted","Data":"3c08db44426e0f92826fda32527cc58c394d55e0887c5e6e488cd62308421822"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.038929 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.040841 4764 generic.go:334] "Generic (PLEG): container finished" podID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerID="adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534" exitCode=0 Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.040882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerDied","Data":"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.040901 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nphvv" 
event={"ID":"429a82b0-5f61-4d42-a0d2-2fcb566f0bcc","Type":"ContainerDied","Data":"c70e8561aa6eb92ac1a8dd26b3d94394e3de043e4cdbc051523d739ecf33788e"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.040918 4764 scope.go:117] "RemoveContainer" containerID="adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.040969 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nphvv" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.041426 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.043487 4764 generic.go:334] "Generic (PLEG): container finished" podID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerID="fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6" exitCode=0 Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.043560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerDied","Data":"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.043593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvnpx" event={"ID":"6990bd44-c839-4dcd-bb4a-8d9da96bf644","Type":"ContainerDied","Data":"e9a2c4e839a18275c09bc7a1f7bdbe7d54025e9a91aada75523699deaf717e4d"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.043695 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvnpx" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.049936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" event={"ID":"190d359f-af45-40db-a90e-91bc465e6e1f","Type":"ContainerDied","Data":"3dde5c03c1d2e10c1815cdf397342a1e721e8af02d514a7036ed3ebdf230a460"} Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.049962 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dde5c03c1d2e10c1815cdf397342a1e721e8af02d514a7036ed3ebdf230a460" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.050002 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-h6xpn" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.054230 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" podStartSLOduration=6.054219012 podStartE2EDuration="6.054219012s" podCreationTimestamp="2026-03-20 14:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:05.054155389 +0000 UTC m=+286.670344518" watchObservedRunningTime="2026-03-20 14:56:05.054219012 +0000 UTC m=+286.670408141" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.054527 4764 scope.go:117] "RemoveContainer" containerID="bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.074918 4764 scope.go:117] "RemoveContainer" containerID="7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.100438 4764 scope.go:117] "RemoveContainer" containerID="adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 
14:56:05.100854 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534\": container with ID starting with adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534 not found: ID does not exist" containerID="adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.100896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534"} err="failed to get container status \"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534\": rpc error: code = NotFound desc = could not find container \"adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534\": container with ID starting with adb2de5aa14fde7c375ae1df3afa1c1f553c58270b5a2367be68328603c74534 not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.100920 4764 scope.go:117] "RemoveContainer" containerID="bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 14:56:05.101244 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283\": container with ID starting with bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283 not found: ID does not exist" containerID="bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.101263 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283"} err="failed to get container status \"bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283\": rpc 
error: code = NotFound desc = could not find container \"bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283\": container with ID starting with bb0959f89b57b0aa7a64d245fd317edc0911a289f860eb721d74ebdfc62a3283 not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.101275 4764 scope.go:117] "RemoveContainer" containerID="7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 14:56:05.101723 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea\": container with ID starting with 7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea not found: ID does not exist" containerID="7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.101744 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea"} err="failed to get container status \"7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea\": rpc error: code = NotFound desc = could not find container \"7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea\": container with ID starting with 7a2de8567b48c0644dc6fc7366002730eec8bd3ff6d8903af86d090757cc38ea not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.101759 4764 scope.go:117] "RemoveContainer" containerID="fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.124947 4764 scope.go:117] "RemoveContainer" containerID="be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.128890 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" podStartSLOduration=6.1288698010000005 podStartE2EDuration="6.128869801s" podCreationTimestamp="2026-03-20 14:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:05.124685132 +0000 UTC m=+286.740874261" watchObservedRunningTime="2026-03-20 14:56:05.128869801 +0000 UTC m=+286.745058930" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.141562 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.146832 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nphvv"] Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.148351 4764 scope.go:117] "RemoveContainer" containerID="aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.161343 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.168972 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvnpx"] Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.173341 4764 scope.go:117] "RemoveContainer" containerID="fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 14:56:05.173744 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6\": container with ID starting with fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6 not found: ID does not exist" containerID="fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6" Mar 20 
14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.173781 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6"} err="failed to get container status \"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6\": rpc error: code = NotFound desc = could not find container \"fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6\": container with ID starting with fe3ddfa8c0d3a899ee8f5f3c2339fe40b4e081dfeef6b7879f17401cc6705bf6 not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.173805 4764 scope.go:117] "RemoveContainer" containerID="be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 14:56:05.174121 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4\": container with ID starting with be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4 not found: ID does not exist" containerID="be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.174147 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4"} err="failed to get container status \"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4\": rpc error: code = NotFound desc = could not find container \"be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4\": container with ID starting with be7974a9741407b6bd3f047d7129d5e168680377f625c4b261328d2703c00ed4 not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.174165 4764 scope.go:117] "RemoveContainer" 
containerID="aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b" Mar 20 14:56:05 crc kubenswrapper[4764]: E0320 14:56:05.174455 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b\": container with ID starting with aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b not found: ID does not exist" containerID="aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.174483 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b"} err="failed to get container status \"aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b\": rpc error: code = NotFound desc = could not find container \"aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b\": container with ID starting with aceab4a8cc7db52383ba454867ae703a95545338583456707bb4eb4b3dad235b not found: ID does not exist" Mar 20 14:56:05 crc kubenswrapper[4764]: I0320 14:56:05.181756 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:07 crc kubenswrapper[4764]: I0320 14:56:07.132305 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" path="/var/lib/kubelet/pods/429a82b0-5f61-4d42-a0d2-2fcb566f0bcc/volumes" Mar 20 14:56:07 crc kubenswrapper[4764]: I0320 14:56:07.133774 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" path="/var/lib/kubelet/pods/6990bd44-c839-4dcd-bb4a-8d9da96bf644/volumes" Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.444260 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.444808 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.444887 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.445997 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.446086 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c" gracePeriod=600 Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.503578 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerName="oauth-openshift" 
containerID="cri-o://adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f" gracePeriod=15 Mar 20 14:56:08 crc kubenswrapper[4764]: I0320 14:56:08.961456 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.090955 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerID="adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f" exitCode=0 Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.091049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" event={"ID":"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870","Type":"ContainerDied","Data":"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f"} Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.091101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" event={"ID":"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870","Type":"ContainerDied","Data":"0246170a27cd6a9152ac3f371af28627e89e47cda007cfb6db92b9a0eeeecdbb"} Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.091137 4764 scope.go:117] "RemoveContainer" containerID="adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.091365 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kctmb" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101353 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101548 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mprhh\" (UniqueName: \"kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 
crc kubenswrapper[4764]: I0320 14:56:09.101608 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101663 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101776 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101837 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101865 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.101888 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert\") pod \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\" (UID: \"c0b7cf43-9ff0-4ab9-a884-955f2bc6e870\") " Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.102590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.105009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.105949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.106403 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.106475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.107092 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c" exitCode=0 Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.107137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c"} Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.107171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336"} Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.110768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh" (OuterVolumeSpecName: "kube-api-access-mprhh") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "kube-api-access-mprhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.111248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.111819 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.112982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.113630 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.113781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.113787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.114048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.119916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" (UID: "c0b7cf43-9ff0-4ab9-a884-955f2bc6e870"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.129808 4764 scope.go:117] "RemoveContainer" containerID="adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f" Mar 20 14:56:09 crc kubenswrapper[4764]: E0320 14:56:09.130653 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f\": container with ID starting with adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f not found: ID does not exist" containerID="adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.130723 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f"} err="failed to get container status \"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f\": rpc error: code = NotFound desc = could not find container \"adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f\": container with ID starting with adf1d23311f8e720fb4776267237c539c13b6e6a90fa418af3ca53de63b2f12f not found: ID does not exist" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203460 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203500 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mprhh\" (UniqueName: \"kubernetes.io/projected/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-kube-api-access-mprhh\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203513 4764 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203527 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203540 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203553 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203564 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203577 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203590 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: 
I0320 14:56:09.203603 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203616 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203629 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203641 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.203654 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.419336 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"] Mar 20 14:56:09 crc kubenswrapper[4764]: I0320 14:56:09.424216 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kctmb"] Mar 20 14:56:11 crc kubenswrapper[4764]: I0320 14:56:11.136468 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" 
path="/var/lib/kubelet/pods/c0b7cf43-9ff0-4ab9-a884-955f2bc6e870/volumes" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.617365 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-748c889698-r9252"] Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619730 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="extract-content" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619770 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="extract-content" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619787 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619799 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619825 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="extract-utilities" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619837 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="extract-utilities" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619861 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190d359f-af45-40db-a90e-91bc465e6e1f" containerName="oc" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619873 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="190d359f-af45-40db-a90e-91bc465e6e1f" containerName="oc" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619933 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" 
containerName="extract-content" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619948 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="extract-content" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619965 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerName="oauth-openshift" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.619977 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerName="oauth-openshift" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.619995 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="extract-utilities" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620007 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="extract-utilities" Mar 20 14:56:18 crc kubenswrapper[4764]: E0320 14:56:18.620022 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620033 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620199 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b7cf43-9ff0-4ab9-a884-955f2bc6e870" containerName="oauth-openshift" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620219 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="190d359f-af45-40db-a90e-91bc465e6e1f" containerName="oc" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620238 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="429a82b0-5f61-4d42-a0d2-2fcb566f0bcc" 
containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620252 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6990bd44-c839-4dcd-bb4a-8d9da96bf644" containerName="registry-server" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.620899 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.624919 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.625283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.625360 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.626878 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.627063 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.627917 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.628102 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.628643 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.628696 
4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.628744 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.628790 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.629064 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.643803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66ws\" (UniqueName: \"kubernetes.io/projected/35d1ed44-a4db-43d7-9904-72c12189afad-kube-api-access-b66ws\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.643871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.643924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-error\") pod \"oauth-openshift-748c889698-r9252\" (UID: 
\"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.643958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-router-certs\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-session\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc 
kubenswrapper[4764]: I0320 14:56:18.644212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644341 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-audit-policies\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-service-ca\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " 
pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35d1ed44-a4db-43d7-9904-72c12189afad-audit-dir\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.644860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-login\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.647082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748c889698-r9252"] Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.650949 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.663549 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-session\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746602 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-audit-policies\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746647 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-service-ca\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746697 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35d1ed44-a4db-43d7-9904-72c12189afad-audit-dir\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746732 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-login\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " 
pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66ws\" (UniqueName: \"kubernetes.io/projected/35d1ed44-a4db-43d7-9904-72c12189afad-kube-api-access-b66ws\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-error\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.746982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-router-certs\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.747600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35d1ed44-a4db-43d7-9904-72c12189afad-audit-dir\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.747881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.749949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-audit-policies\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.750073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-service-ca\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc 
kubenswrapper[4764]: I0320 14:56:18.751602 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.755035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-error\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.755166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-session\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.755675 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.755716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-router-certs\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.756578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.757560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-login\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.758139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.766313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35d1ed44-a4db-43d7-9904-72c12189afad-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " 
pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.776363 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66ws\" (UniqueName: \"kubernetes.io/projected/35d1ed44-a4db-43d7-9904-72c12189afad-kube-api-access-b66ws\") pod \"oauth-openshift-748c889698-r9252\" (UID: \"35d1ed44-a4db-43d7-9904-72c12189afad\") " pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:18 crc kubenswrapper[4764]: I0320 14:56:18.964697 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.161971 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.162231 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" podUID="38a811ce-b082-41da-8661-4f9979d3f78d" containerName="controller-manager" containerID="cri-o://da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358" gracePeriod=30 Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.246327 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.246783 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" podUID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" containerName="route-controller-manager" containerID="cri-o://78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3" gracePeriod=30 Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.441352 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-748c889698-r9252"] Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.673048 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.698617 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca\") pod \"38a811ce-b082-41da-8661-4f9979d3f78d\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782302 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config\") pod \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert\") pod \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782416 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert\") pod \"38a811ce-b082-41da-8661-4f9979d3f78d\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782500 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca\") pod \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gs84\" (UniqueName: \"kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84\") pod \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\" (UID: \"296d0cc6-b7eb-4d1f-a443-8d53147742ec\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782542 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles\") pod \"38a811ce-b082-41da-8661-4f9979d3f78d\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpccp\" (UniqueName: \"kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp\") pod \"38a811ce-b082-41da-8661-4f9979d3f78d\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.782605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config\") pod \"38a811ce-b082-41da-8661-4f9979d3f78d\" (UID: \"38a811ce-b082-41da-8661-4f9979d3f78d\") " Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.783355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config" (OuterVolumeSpecName: "config") pod "38a811ce-b082-41da-8661-4f9979d3f78d" (UID: 
"38a811ce-b082-41da-8661-4f9979d3f78d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.783475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca" (OuterVolumeSpecName: "client-ca") pod "38a811ce-b082-41da-8661-4f9979d3f78d" (UID: "38a811ce-b082-41da-8661-4f9979d3f78d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.783479 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca" (OuterVolumeSpecName: "client-ca") pod "296d0cc6-b7eb-4d1f-a443-8d53147742ec" (UID: "296d0cc6-b7eb-4d1f-a443-8d53147742ec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.783488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config" (OuterVolumeSpecName: "config") pod "296d0cc6-b7eb-4d1f-a443-8d53147742ec" (UID: "296d0cc6-b7eb-4d1f-a443-8d53147742ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.783969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "38a811ce-b082-41da-8661-4f9979d3f78d" (UID: "38a811ce-b082-41da-8661-4f9979d3f78d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.787624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "296d0cc6-b7eb-4d1f-a443-8d53147742ec" (UID: "296d0cc6-b7eb-4d1f-a443-8d53147742ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.787928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "38a811ce-b082-41da-8661-4f9979d3f78d" (UID: "38a811ce-b082-41da-8661-4f9979d3f78d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.792536 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84" (OuterVolumeSpecName: "kube-api-access-2gs84") pod "296d0cc6-b7eb-4d1f-a443-8d53147742ec" (UID: "296d0cc6-b7eb-4d1f-a443-8d53147742ec"). InnerVolumeSpecName "kube-api-access-2gs84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.792936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp" (OuterVolumeSpecName: "kube-api-access-rpccp") pod "38a811ce-b082-41da-8661-4f9979d3f78d" (UID: "38a811ce-b082-41da-8661-4f9979d3f78d"). InnerVolumeSpecName "kube-api-access-rpccp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884708 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884754 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884778 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gs84\" (UniqueName: \"kubernetes.io/projected/296d0cc6-b7eb-4d1f-a443-8d53147742ec-kube-api-access-2gs84\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884797 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpccp\" (UniqueName: \"kubernetes.io/projected/38a811ce-b082-41da-8661-4f9979d3f78d-kube-api-access-rpccp\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884814 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884831 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38a811ce-b082-41da-8661-4f9979d3f78d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884847 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/296d0cc6-b7eb-4d1f-a443-8d53147742ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884862 4764 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/296d0cc6-b7eb-4d1f-a443-8d53147742ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:19 crc kubenswrapper[4764]: I0320 14:56:19.884878 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38a811ce-b082-41da-8661-4f9979d3f78d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.187786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748c889698-r9252" event={"ID":"35d1ed44-a4db-43d7-9904-72c12189afad","Type":"ContainerStarted","Data":"0bc8afa9c890d11c155e0ccad45cfdf9d269e41f56e7e501b8563d8cdfeb6147"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.187905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748c889698-r9252" event={"ID":"35d1ed44-a4db-43d7-9904-72c12189afad","Type":"ContainerStarted","Data":"3246ef3c51a5b04fe5ebb4f6f982578637f028c597514096e8837c62c1c757d5"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.187940 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.189946 4764 generic.go:334] "Generic (PLEG): container finished" podID="38a811ce-b082-41da-8661-4f9979d3f78d" containerID="da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358" exitCode=0 Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.189984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" event={"ID":"38a811ce-b082-41da-8661-4f9979d3f78d","Type":"ContainerDied","Data":"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.190038 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" event={"ID":"38a811ce-b082-41da-8661-4f9979d3f78d","Type":"ContainerDied","Data":"8a2cd1b71aff03bc9ce9cce1d74107dac2da8e0633c6021c2d9a4babc9a607cc"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.190040 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.190069 4764 scope.go:117] "RemoveContainer" containerID="da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.193094 4764 generic.go:334] "Generic (PLEG): container finished" podID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" containerID="78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3" exitCode=0 Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.193140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" event={"ID":"296d0cc6-b7eb-4d1f-a443-8d53147742ec","Type":"ContainerDied","Data":"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.193177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" event={"ID":"296d0cc6-b7eb-4d1f-a443-8d53147742ec","Type":"ContainerDied","Data":"3c08db44426e0f92826fda32527cc58c394d55e0887c5e6e488cd62308421822"} Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.193273 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.217810 4764 scope.go:117] "RemoveContainer" containerID="da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358" Mar 20 14:56:20 crc kubenswrapper[4764]: E0320 14:56:20.218522 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358\": container with ID starting with da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358 not found: ID does not exist" containerID="da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.218601 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358"} err="failed to get container status \"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358\": rpc error: code = NotFound desc = could not find container \"da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358\": container with ID starting with da3ca398e7ec277c8c36b5b6dca713bcdcad086aee76d50b36bfa247ab5d3358 not found: ID does not exist" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.218677 4764 scope.go:117] "RemoveContainer" containerID="78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.230290 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-748c889698-r9252" podStartSLOduration=37.23025871 podStartE2EDuration="37.23025871s" podCreationTimestamp="2026-03-20 14:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:20.219249458 
+0000 UTC m=+301.835438657" watchObservedRunningTime="2026-03-20 14:56:20.23025871 +0000 UTC m=+301.846447879" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.265596 4764 scope.go:117] "RemoveContainer" containerID="78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3" Mar 20 14:56:20 crc kubenswrapper[4764]: E0320 14:56:20.274055 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3\": container with ID starting with 78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3 not found: ID does not exist" containerID="78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.274115 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3"} err="failed to get container status \"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3\": rpc error: code = NotFound desc = could not find container \"78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3\": container with ID starting with 78404cd9499602a250e65b75a7c091f37f26ec0351425b04992a93c7dae2a8b3 not found: ID does not exist" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.283904 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.287579 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f6d86c7f7-l4gnw"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.292048 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.295524 4764 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f6db9f9f4-qm4jz"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.365575 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-748c889698-r9252" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.620555 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5999d99c6c-gskr2"] Mar 20 14:56:20 crc kubenswrapper[4764]: E0320 14:56:20.620804 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a811ce-b082-41da-8661-4f9979d3f78d" containerName="controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.620818 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a811ce-b082-41da-8661-4f9979d3f78d" containerName="controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: E0320 14:56:20.620832 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" containerName="route-controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.620842 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" containerName="route-controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.620951 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a811ce-b082-41da-8661-4f9979d3f78d" containerName="controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.620962 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" containerName="route-controller-manager" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.621417 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.622388 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.622813 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.625954 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.630724 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.631853 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.632185 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.632340 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633546 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633689 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633950 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.633960 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.634067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.634081 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.637427 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5999d99c6c-gskr2"] Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.640020 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-client-ca\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6sggv\" (UniqueName: \"kubernetes.io/projected/7f1c734f-cf29-446c-8f39-9fa4b7575654-kube-api-access-6sggv\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7qq\" (UniqueName: \"kubernetes.io/projected/15ba3b14-b4cb-4dea-ad3b-ff397b996899-kube-api-access-9c7qq\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ba3b14-b4cb-4dea-ad3b-ff397b996899-serving-cert\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-config\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.693965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f1c734f-cf29-446c-8f39-9fa4b7575654-serving-cert\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: 
\"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.694036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-config\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.694068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-client-ca\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.694084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-proxy-ca-bundles\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.795011 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7qq\" (UniqueName: \"kubernetes.io/projected/15ba3b14-b4cb-4dea-ad3b-ff397b996899-kube-api-access-9c7qq\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.795122 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ba3b14-b4cb-4dea-ad3b-ff397b996899-serving-cert\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.795163 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-config\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.795257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f1c734f-cf29-446c-8f39-9fa4b7575654-serving-cert\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.796696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-config\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.796794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-proxy-ca-bundles\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " 
pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.796832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-client-ca\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.796891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-client-ca\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.796945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sggv\" (UniqueName: \"kubernetes.io/projected/7f1c734f-cf29-446c-8f39-9fa4b7575654-kube-api-access-6sggv\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.797720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-config\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.798320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-config\") pod 
\"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.799073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-client-ca\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.799258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ba3b14-b4cb-4dea-ad3b-ff397b996899-serving-cert\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.799321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7f1c734f-cf29-446c-8f39-9fa4b7575654-proxy-ca-bundles\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.799687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15ba3b14-b4cb-4dea-ad3b-ff397b996899-client-ca\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.805566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f1c734f-cf29-446c-8f39-9fa4b7575654-serving-cert\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.812131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7qq\" (UniqueName: \"kubernetes.io/projected/15ba3b14-b4cb-4dea-ad3b-ff397b996899-kube-api-access-9c7qq\") pod \"route-controller-manager-79fc999cdf-qbqjl\" (UID: \"15ba3b14-b4cb-4dea-ad3b-ff397b996899\") " pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.832321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sggv\" (UniqueName: \"kubernetes.io/projected/7f1c734f-cf29-446c-8f39-9fa4b7575654-kube-api-access-6sggv\") pod \"controller-manager-5999d99c6c-gskr2\" (UID: \"7f1c734f-cf29-446c-8f39-9fa4b7575654\") " pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.940877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:20 crc kubenswrapper[4764]: I0320 14:56:20.946681 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:21 crc kubenswrapper[4764]: I0320 14:56:21.145315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296d0cc6-b7eb-4d1f-a443-8d53147742ec" path="/var/lib/kubelet/pods/296d0cc6-b7eb-4d1f-a443-8d53147742ec/volumes" Mar 20 14:56:21 crc kubenswrapper[4764]: I0320 14:56:21.150844 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a811ce-b082-41da-8661-4f9979d3f78d" path="/var/lib/kubelet/pods/38a811ce-b082-41da-8661-4f9979d3f78d/volumes" Mar 20 14:56:21 crc kubenswrapper[4764]: I0320 14:56:21.241272 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5999d99c6c-gskr2"] Mar 20 14:56:21 crc kubenswrapper[4764]: W0320 14:56:21.252243 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f1c734f_cf29_446c_8f39_9fa4b7575654.slice/crio-c41fb69825082f3792f7394778e4b81ee7e6b7d95b112cb7e17ae6f657e35ce7 WatchSource:0}: Error finding container c41fb69825082f3792f7394778e4b81ee7e6b7d95b112cb7e17ae6f657e35ce7: Status 404 returned error can't find the container with id c41fb69825082f3792f7394778e4b81ee7e6b7d95b112cb7e17ae6f657e35ce7 Mar 20 14:56:21 crc kubenswrapper[4764]: I0320 14:56:21.291474 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl"] Mar 20 14:56:21 crc kubenswrapper[4764]: W0320 14:56:21.305762 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ba3b14_b4cb_4dea_ad3b_ff397b996899.slice/crio-4388c8703ba6108d227abd63855a3d932598fd3b0b00bc96aa4ff9d36ce7bf56 WatchSource:0}: Error finding container 4388c8703ba6108d227abd63855a3d932598fd3b0b00bc96aa4ff9d36ce7bf56: Status 404 returned error can't find the 
container with id 4388c8703ba6108d227abd63855a3d932598fd3b0b00bc96aa4ff9d36ce7bf56 Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.216883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" event={"ID":"7f1c734f-cf29-446c-8f39-9fa4b7575654","Type":"ContainerStarted","Data":"6a45b370810b0a0013504ca0aa0e42ca92c529fd847a8be58bb25b7e3550b08c"} Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.217928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" event={"ID":"7f1c734f-cf29-446c-8f39-9fa4b7575654","Type":"ContainerStarted","Data":"c41fb69825082f3792f7394778e4b81ee7e6b7d95b112cb7e17ae6f657e35ce7"} Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.219757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" event={"ID":"15ba3b14-b4cb-4dea-ad3b-ff397b996899","Type":"ContainerStarted","Data":"db7c5430a14fb4791b3ecdee5d1a4e3877c78b199fad71301ba7229b1fa14af9"} Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.219885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" event={"ID":"15ba3b14-b4cb-4dea-ad3b-ff397b996899","Type":"ContainerStarted","Data":"4388c8703ba6108d227abd63855a3d932598fd3b0b00bc96aa4ff9d36ce7bf56"} Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.220200 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.227479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.245524 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" podStartSLOduration=3.245484301 podStartE2EDuration="3.245484301s" podCreationTimestamp="2026-03-20 14:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:22.242992482 +0000 UTC m=+303.859181641" watchObservedRunningTime="2026-03-20 14:56:22.245484301 +0000 UTC m=+303.861673440" Mar 20 14:56:22 crc kubenswrapper[4764]: I0320 14:56:22.271120 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79fc999cdf-qbqjl" podStartSLOduration=3.271087583 podStartE2EDuration="3.271087583s" podCreationTimestamp="2026-03-20 14:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:22.270584285 +0000 UTC m=+303.886773434" watchObservedRunningTime="2026-03-20 14:56:22.271087583 +0000 UTC m=+303.887276722" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.183304 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.184881 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.186444 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187129 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187280 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a" gracePeriod=15 Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187362 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6" gracePeriod=15 Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187533 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74" gracePeriod=15 Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187152 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a" gracePeriod=15 Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 
14:56:23.187538 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.188024 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.188119 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.188201 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.188286 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.188411 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.189441 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.189532 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.190058 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190144 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: 
E0320 14:56:23.190219 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190284 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.190364 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190448 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.190525 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190599 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190735 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190749 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190765 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190774 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190782 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190792 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190803 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190814 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.190922 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190932 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.190944 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.190952 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.187839 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be" gracePeriod=15 Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.191072 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.233258 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.234965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.235061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.235105 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.235284 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 
14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.235296 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.238739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.238906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.239458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.239613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.245482 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.246139 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.246604 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.246876 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341813 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.341967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342016 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342459 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.342990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: I0320 14:56:23.535482 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:56:23 crc kubenswrapper[4764]: W0320 14:56:23.569818 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4461b039e7271183fe79e2bb78ebe69951a87218c403777d823cbfaf86eaf579 WatchSource:0}: Error finding container 4461b039e7271183fe79e2bb78ebe69951a87218c403777d823cbfaf86eaf579: Status 404 returned error can't find the container with id 4461b039e7271183fe79e2bb78ebe69951a87218c403777d823cbfaf86eaf579 Mar 20 14:56:23 crc kubenswrapper[4764]: E0320 14:56:23.575731 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9480aad94f68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:56:23.573647208 +0000 UTC m=+305.189836347,LastTimestamp:2026-03-20 14:56:23.573647208 +0000 UTC m=+305.189836347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.241572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43"} Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.242111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4461b039e7271183fe79e2bb78ebe69951a87218c403777d823cbfaf86eaf579"} Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.243322 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.243834 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.244197 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.256188 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.260888 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.262567 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a" exitCode=0 Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.262594 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74" exitCode=0 Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.262604 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a" exitCode=0 Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.262612 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6" exitCode=2 Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.263077 4764 scope.go:117] "RemoveContainer" containerID="ff8f1319887e509029b45155a0f7d58a08769d1ece7adcf58acd9c5463567666" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.264968 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4da6599-c529-49c6-a409-e6abdec42a79" containerID="aaf70e68004ef94b046b3ba3c602025d765d74d331634e98a1fae16ba277966b" exitCode=0 Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.265192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"d4da6599-c529-49c6-a409-e6abdec42a79","Type":"ContainerDied","Data":"aaf70e68004ef94b046b3ba3c602025d765d74d331634e98a1fae16ba277966b"} Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.266667 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.267291 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.267851 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: I0320 14:56:24.268414 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.596240 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:56:24Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:56:24Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:56:24Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T14:56:24Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.596956 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.597742 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.598138 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 
14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.598497 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:24 crc kubenswrapper[4764]: E0320 14:56:24.598530 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.288505 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.593365 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.595162 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.596150 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.596840 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.597526 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.597900 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682405 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682455 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682672 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682783 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.682960 4764 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.683005 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.683036 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.715870 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.716483 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.716782 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.717228 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.717934 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.783848 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access\") pod \"d4da6599-c529-49c6-a409-e6abdec42a79\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.783956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir\") pod \"d4da6599-c529-49c6-a409-e6abdec42a79\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.784055 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock\") pod \"d4da6599-c529-49c6-a409-e6abdec42a79\" (UID: \"d4da6599-c529-49c6-a409-e6abdec42a79\") " Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.784176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d4da6599-c529-49c6-a409-e6abdec42a79" (UID: 
"d4da6599-c529-49c6-a409-e6abdec42a79"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.784336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock" (OuterVolumeSpecName: "var-lock") pod "d4da6599-c529-49c6-a409-e6abdec42a79" (UID: "d4da6599-c529-49c6-a409-e6abdec42a79"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.784594 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.784627 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d4da6599-c529-49c6-a409-e6abdec42a79-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.793629 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d4da6599-c529-49c6-a409-e6abdec42a79" (UID: "d4da6599-c529-49c6-a409-e6abdec42a79"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:25 crc kubenswrapper[4764]: I0320 14:56:25.886686 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4da6599-c529-49c6-a409-e6abdec42a79-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.308126 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.309972 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be" exitCode=0 Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.310266 4764 scope.go:117] "RemoveContainer" containerID="6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.310351 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.314346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d4da6599-c529-49c6-a409-e6abdec42a79","Type":"ContainerDied","Data":"5b5da596ef192c47caedb9a9e301083fdcb5ea10306b10150ba2943b470b4eba"} Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.314425 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b5da596ef192c47caedb9a9e301083fdcb5ea10306b10150ba2943b470b4eba" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.314503 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.346450 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.346891 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.347221 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.347993 4764 scope.go:117] "RemoveContainer" containerID="4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.348238 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.349754 4764 status_manager.go:851] "Failed to get status for pod" 
podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.350252 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.350856 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.351269 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.374476 4764 scope.go:117] "RemoveContainer" containerID="c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.399750 4764 scope.go:117] "RemoveContainer" containerID="65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.432104 4764 scope.go:117] "RemoveContainer" 
containerID="596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.458815 4764 scope.go:117] "RemoveContainer" containerID="f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.494087 4764 scope.go:117] "RemoveContainer" containerID="6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.494968 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\": container with ID starting with 6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a not found: ID does not exist" containerID="6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.495080 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a"} err="failed to get container status \"6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\": rpc error: code = NotFound desc = could not find container \"6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a\": container with ID starting with 6ef618aa1a1b148f8eda6cb633e931f4accdcf8d4fd2f27a322e567e8f4d087a not found: ID does not exist" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.495132 4764 scope.go:117] "RemoveContainer" containerID="4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.496137 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\": container with ID starting with 
4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74 not found: ID does not exist" containerID="4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.496188 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74"} err="failed to get container status \"4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\": rpc error: code = NotFound desc = could not find container \"4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74\": container with ID starting with 4d66d12f5ab3e2a9bc46eaeecb6bd45c526d2dcdd97d285f517fb5fa8d64ae74 not found: ID does not exist" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.496210 4764 scope.go:117] "RemoveContainer" containerID="c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.497295 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\": container with ID starting with c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a not found: ID does not exist" containerID="c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.497403 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a"} err="failed to get container status \"c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\": rpc error: code = NotFound desc = could not find container \"c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a\": container with ID starting with c6cdbca447a28b1a15fa0b0c7710f6c400f1c8243585d677a85e9d33b6a7d18a not found: ID does not 
exist" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.497435 4764 scope.go:117] "RemoveContainer" containerID="65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.497927 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\": container with ID starting with 65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6 not found: ID does not exist" containerID="65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.497947 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6"} err="failed to get container status \"65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\": rpc error: code = NotFound desc = could not find container \"65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6\": container with ID starting with 65bc4846bd97e71a7983208ae72f8ecc0b6c7b465152a8e920f001372352d0b6 not found: ID does not exist" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.497961 4764 scope.go:117] "RemoveContainer" containerID="596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.498276 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\": container with ID starting with 596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be not found: ID does not exist" containerID="596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.498296 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be"} err="failed to get container status \"596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\": rpc error: code = NotFound desc = could not find container \"596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be\": container with ID starting with 596b53e9e85731414dc676fd3011dd4317c7e25918c874b78ccf5846616b70be not found: ID does not exist" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.498309 4764 scope.go:117] "RemoveContainer" containerID="f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe" Mar 20 14:56:26 crc kubenswrapper[4764]: E0320 14:56:26.498987 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\": container with ID starting with f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe not found: ID does not exist" containerID="f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe" Mar 20 14:56:26 crc kubenswrapper[4764]: I0320 14:56:26.499009 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe"} err="failed to get container status \"f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\": rpc error: code = NotFound desc = could not find container \"f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe\": container with ID starting with f2638d3d74f6ecda3e13502deda2032172e521ffd1944058fd72df62cbf6eafe not found: ID does not exist" Mar 20 14:56:27 crc kubenswrapper[4764]: I0320 14:56:27.132889 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 14:56:29 crc 
kubenswrapper[4764]: I0320 14:56:29.130909 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:29 crc kubenswrapper[4764]: I0320 14:56:29.131763 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:29 crc kubenswrapper[4764]: I0320 14:56:29.132325 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.102273 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.64:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9480aad94f68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 14:56:23.573647208 +0000 UTC m=+305.189836347,LastTimestamp:2026-03-20 14:56:23.573647208 +0000 UTC m=+305.189836347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.417953 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.418940 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.419574 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.420140 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.420717 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:33 crc 
kubenswrapper[4764]: I0320 14:56:33.420797 4764 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.421441 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="200ms" Mar 20 14:56:33 crc kubenswrapper[4764]: E0320 14:56:33.623128 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="400ms" Mar 20 14:56:34 crc kubenswrapper[4764]: E0320 14:56:34.023737 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="800ms" Mar 20 14:56:34 crc kubenswrapper[4764]: E0320 14:56:34.213252 4764 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" volumeName="registry-storage" Mar 20 14:56:34 crc kubenswrapper[4764]: E0320 14:56:34.825205 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: 
connection refused" interval="1.6s" Mar 20 14:56:36 crc kubenswrapper[4764]: E0320 14:56:36.427107 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="3.2s" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.241423 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.241481 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.404946 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.406326 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.406450 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31" exitCode=1 Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.406501 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31"} Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.407246 4764 scope.go:117] "RemoveContainer" containerID="3bf3cd6ccfc5e46f91899d3d8bfc27abca34a141133129d28cfc1f92e6005f31" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.407568 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.408671 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.409063 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:37 crc kubenswrapper[4764]: I0320 14:56:37.409365 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.125781 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.126766 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.127349 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.127785 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.128093 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: 
connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.142469 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.142495 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:38 crc kubenswrapper[4764]: E0320 14:56:38.142919 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.143692 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:38 crc kubenswrapper[4764]: W0320 14:56:38.165086 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-10fce665bdc4b4a8db0586f2cc5002c1e8a6c03fb97890019d5906a2a0de209b WatchSource:0}: Error finding container 10fce665bdc4b4a8db0586f2cc5002c1e8a6c03fb97890019d5906a2a0de209b: Status 404 returned error can't find the container with id 10fce665bdc4b4a8db0586f2cc5002c1e8a6c03fb97890019d5906a2a0de209b Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.420269 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.421237 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.421364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ed1e11833bfad8ba10ee183e510208ded8aee7835137f11c75fe56452815715"} Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.422408 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.422896 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.423516 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.423753 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:38 crc kubenswrapper[4764]: I0320 14:56:38.423720 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"10fce665bdc4b4a8db0586f2cc5002c1e8a6c03fb97890019d5906a2a0de209b"} Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.137765 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.138433 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.138932 4764 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.139283 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.139676 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.434581 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="38e6feae9c3eadb51af2829efd963efb3a4779002c4df367409664e42e64689e" exitCode=0 Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.434652 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"38e6feae9c3eadb51af2829efd963efb3a4779002c4df367409664e42e64689e"} Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.435023 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.435061 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:39 crc kubenswrapper[4764]: E0320 14:56:39.435680 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.436012 4764 status_manager.go:851] 
"Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.436687 4764 status_manager.go:851] "Failed to get status for pod" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.437346 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.438169 4764 status_manager.go:851] "Failed to get status for pod" podUID="7f1c734f-cf29-446c-8f39-9fa4b7575654" pod="openshift-controller-manager/controller-manager-5999d99c6c-gskr2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5999d99c6c-gskr2\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: I0320 14:56:39.438682 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.64:6443: connect: connection refused" Mar 20 14:56:39 crc kubenswrapper[4764]: E0320 
14:56:39.628449 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.64:6443: connect: connection refused" interval="6.4s" Mar 20 14:56:40 crc kubenswrapper[4764]: I0320 14:56:40.447637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72e3d8391ef0c8658fe56ddb2bbadfdb2ddb85a89e9e21f3fd5f513bb1576e6c"} Mar 20 14:56:40 crc kubenswrapper[4764]: I0320 14:56:40.447909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74331926c35d736361b05aa51a7340b36a77e01c5145478851262cc35cbe9711"} Mar 20 14:56:40 crc kubenswrapper[4764]: I0320 14:56:40.447919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d0eef70b67683c94916808d39ed0833d92204f1aa33045cde0768f274b6baec7"} Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.342109 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.342208 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.342269 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.456300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f764aedcea7eb6c01965026d39509effdeb0fe473d225f5e5a79d387c89819de"} Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.456340 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0631b1da1b025a73fe46d2399fdd61124b0acfc4a05f2fedcf03c0bc06ed7f7f"} Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.456604 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.456619 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:41 crc kubenswrapper[4764]: I0320 14:56:41.456621 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:43 crc kubenswrapper[4764]: I0320 14:56:43.144033 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:43 crc kubenswrapper[4764]: I0320 14:56:43.144542 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:43 crc kubenswrapper[4764]: I0320 14:56:43.153010 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:46 crc kubenswrapper[4764]: I0320 14:56:46.477725 4764 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:47 crc kubenswrapper[4764]: I0320 14:56:47.240622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:56:47 crc kubenswrapper[4764]: I0320 14:56:47.493473 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:47 crc kubenswrapper[4764]: I0320 14:56:47.493530 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:47 crc kubenswrapper[4764]: I0320 14:56:47.499993 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:47 crc kubenswrapper[4764]: I0320 14:56:47.503601 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4018632a-4e93-4e85-bfbc-6e9a5c698bcf" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.097443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.100560 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.109417 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.297159 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.502667 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.503157 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aa943b85-f686-4d41-8822-f29c0a6defdf" Mar 20 14:56:48 crc kubenswrapper[4764]: I0320 14:56:48.509014 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4018632a-4e93-4e85-bfbc-6e9a5c698bcf" Mar 20 14:56:49 crc kubenswrapper[4764]: I0320 14:56:49.511307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"af535dd1a36b4fbc62678667dcabcdeea1ab298da04d026f8b957b8af57b1186"} Mar 20 14:56:49 crc kubenswrapper[4764]: I0320 14:56:49.511753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f814d59197ae779fdf53e027e852f402455792da20387fe93763a71ecd5df191"} Mar 20 14:56:51 crc 
kubenswrapper[4764]: I0320 14:56:51.340776 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 14:56:51 crc kubenswrapper[4764]: I0320 14:56:51.340920 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.553816 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.865154 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.873066 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=32.873012784 podStartE2EDuration="32.873012784s" podCreationTimestamp="2026-03-20 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:46.539317842 +0000 UTC m=+328.155506981" watchObservedRunningTime="2026-03-20 14:56:55.873012784 +0000 UTC m=+337.489201953" Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.874896 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.874968 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.884129 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 14:56:55 crc kubenswrapper[4764]: I0320 14:56:55.906684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.906661475 podStartE2EDuration="9.906661475s" podCreationTimestamp="2026-03-20 14:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:56:55.900548545 +0000 UTC m=+337.516737684" watchObservedRunningTime="2026-03-20 14:56:55.906661475 +0000 UTC m=+337.522850644" Mar 20 14:56:56 crc kubenswrapper[4764]: I0320 14:56:56.564655 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 14:56:56 crc kubenswrapper[4764]: I0320 14:56:56.811765 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 14:56:56 crc kubenswrapper[4764]: I0320 14:56:56.859660 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 14:56:56 crc kubenswrapper[4764]: I0320 14:56:56.979343 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.170212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.225162 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.233984 4764 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.278584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.397694 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.472594 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.605288 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.605655 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43" gracePeriod=5 Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.701185 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.790087 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.864282 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 14:56:57 crc kubenswrapper[4764]: I0320 14:56:57.965456 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 14:56:57 crc 
kubenswrapper[4764]: I0320 14:56:57.967461 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 14:56:58 crc kubenswrapper[4764]: I0320 14:56:58.144620 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 14:56:58 crc kubenswrapper[4764]: I0320 14:56:58.252606 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 14:56:58 crc kubenswrapper[4764]: I0320 14:56:58.463351 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 14:56:58 crc kubenswrapper[4764]: I0320 14:56:58.871906 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.186032 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.249438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.302301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.363318 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.367667 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.389271 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 
20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.428176 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.618280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.701963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.804806 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.813721 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.829611 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 14:56:59 crc kubenswrapper[4764]: I0320 14:56:59.861556 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.025417 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.043581 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.066203 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 14:57:00 crc kubenswrapper[4764]: 
I0320 14:57:00.122076 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.176272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.237937 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.313601 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.319850 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.332057 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.399121 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.624028 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.745403 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 14:57:00 crc kubenswrapper[4764]: I0320 14:57:00.872953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.016112 4764 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.057301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.171105 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.188500 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.234663 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.248517 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.290188 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.291174 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.348211 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.350252 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.356632 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 14:57:01 crc 
kubenswrapper[4764]: I0320 14:57:01.495997 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.556626 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.677429 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.718402 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.813568 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.848145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.865314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 14:57:01 crc kubenswrapper[4764]: I0320 14:57:01.905369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.037752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.097832 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.114935 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.188689 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.238218 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.255287 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.350566 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.530314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.767203 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.790325 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.790440 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.794417 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.807892 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.852866 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.906854 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.906969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.906999 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907130 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907424 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907765 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907794 4764 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907810 4764 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.907827 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.910408 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.918706 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:57:02 crc kubenswrapper[4764]: I0320 14:57:02.926953 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.009366 4764 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.076683 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.137225 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.137685 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.149906 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.153286 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.153329 4764 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2a12d2b8-a008-49fa-aa03-9c5e143a1482" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.161206 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.161266 4764 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2a12d2b8-a008-49fa-aa03-9c5e143a1482" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.162572 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.312262 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.314583 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.350295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.378145 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.454626 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.570634 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.582541 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.598655 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 14:57:03 crc 
kubenswrapper[4764]: I0320 14:57:03.629133 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.629178 4764 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43" exitCode=137 Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.629220 4764 scope.go:117] "RemoveContainer" containerID="9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.629329 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.646974 4764 scope.go:117] "RemoveContainer" containerID="9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43" Mar 20 14:57:03 crc kubenswrapper[4764]: E0320 14:57:03.647588 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43\": container with ID starting with 9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43 not found: ID does not exist" containerID="9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.647665 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43"} err="failed to get container status \"9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43\": rpc error: code = NotFound desc = could not find container \"9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43\": container with ID 
starting with 9fa7b788a18d9e903bc97a714978615016c12c52c0d991cb98d38859d5249c43 not found: ID does not exist" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.715486 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.728696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.740783 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.750032 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.750496 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.791472 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.797139 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.835980 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.873609 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 14:57:03.926113 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 14:57:03 crc kubenswrapper[4764]: I0320 
14:57:03.927643 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.081190 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.146144 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.186262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.228951 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.229178 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.252646 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.259696 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.277703 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.329011 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.418501 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 14:57:04 crc 
kubenswrapper[4764]: I0320 14:57:04.448022 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.548114 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.555714 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.577131 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.651187 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.651521 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.653220 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.714361 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.723943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.795863 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.810215 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 14:57:04 crc kubenswrapper[4764]: I0320 14:57:04.989327 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.006912 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.013997 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.101196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.114222 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.139164 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.167104 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.213310 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.254988 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.278065 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.533639 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.534777 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.553451 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.554247 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.570599 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.648476 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.723286 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.773780 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.861965 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.908869 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.950441 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 
14:57:05 crc kubenswrapper[4764]: I0320 14:57:05.995565 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.006480 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.035312 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.060899 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.062165 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.085410 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.114171 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.116221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.116309 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.142628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.188857 4764 reflector.go:368] Caches 
populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.222611 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.236909 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.317202 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.329133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.535838 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.555157 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.557490 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.579740 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.599636 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.742169 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.766903 4764 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.800674 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 14:57:06 crc kubenswrapper[4764]: I0320 14:57:06.997729 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.016959 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.069883 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.185706 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.255752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.301551 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.311974 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.317051 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.318178 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.356427 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.389207 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.405771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.442161 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.874846 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.901703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.928414 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.979347 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 14:57:07 crc kubenswrapper[4764]: I0320 14:57:07.991770 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.039154 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.084057 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.096883 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.167923 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.183631 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.304499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.407937 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.651303 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.676788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.737333 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.759200 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 14:57:08.774552 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 14:57:08 crc kubenswrapper[4764]: I0320 
14:57:08.839917 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.001803 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.032198 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.034174 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.090930 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.091991 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.158847 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.216409 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.368221 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.371850 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.410309 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.572643 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.599929 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.685368 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.715789 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.725550 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.725667 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.771273 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 14:57:09 crc kubenswrapper[4764]: I0320 14:57:09.961600 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.092094 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.181255 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 14:57:10 crc 
kubenswrapper[4764]: I0320 14:57:10.263272 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.312227 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.507768 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.546043 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.607903 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.620541 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.642152 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.739927 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.833683 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.912733 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 
14:57:10.963035 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.967113 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 14:57:10 crc kubenswrapper[4764]: I0320 14:57:10.998424 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.077629 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.102666 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.106024 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.410154 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.446630 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.661133 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.662645 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 14:57:11 crc kubenswrapper[4764]: I0320 14:57:11.914477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 14:57:12 crc kubenswrapper[4764]: 
I0320 14:57:12.407841 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 14:57:12 crc kubenswrapper[4764]: I0320 14:57:12.818348 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 14:57:23 crc kubenswrapper[4764]: I0320 14:57:23.922112 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 14:57:27 crc kubenswrapper[4764]: I0320 14:57:27.090535 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 14:57:30 crc kubenswrapper[4764]: I0320 14:57:30.445335 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 14:57:31 crc kubenswrapper[4764]: I0320 14:57:31.809896 4764 generic.go:334] "Generic (PLEG): container finished" podID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerID="fe3d59b4dd5fb1167dcb24814c525c24bc646126092a919a5c0be4be171f0bca" exitCode=0 Mar 20 14:57:31 crc kubenswrapper[4764]: I0320 14:57:31.809956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerDied","Data":"fe3d59b4dd5fb1167dcb24814c525c24bc646126092a919a5c0be4be171f0bca"} Mar 20 14:57:31 crc kubenswrapper[4764]: I0320 14:57:31.810733 4764 scope.go:117] "RemoveContainer" containerID="fe3d59b4dd5fb1167dcb24814c525c24bc646126092a919a5c0be4be171f0bca" Mar 20 14:57:32 crc kubenswrapper[4764]: I0320 14:57:32.819569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerStarted","Data":"0d0fbd440c8296db8ad3a9866cb5b928779aa02b85d6f9bff1e43e93ad722114"} Mar 20 
14:57:32 crc kubenswrapper[4764]: I0320 14:57:32.820865 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:57:32 crc kubenswrapper[4764]: I0320 14:57:32.824130 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:57:32 crc kubenswrapper[4764]: I0320 14:57:32.931929 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 14:57:33 crc kubenswrapper[4764]: I0320 14:57:33.307021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 14:57:33 crc kubenswrapper[4764]: I0320 14:57:33.675027 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 14:57:34 crc kubenswrapper[4764]: I0320 14:57:34.800562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 14:57:38 crc kubenswrapper[4764]: I0320 14:57:38.065333 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 14:57:43 crc kubenswrapper[4764]: I0320 14:57:43.489029 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 14:57:49 crc kubenswrapper[4764]: I0320 14:57:49.185526 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 14:57:50 crc kubenswrapper[4764]: I0320 14:57:50.005625 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 14:57:50 crc kubenswrapper[4764]: I0320 14:57:50.293999 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.193418 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566978-kl2wd"] Mar 20 14:58:00 crc kubenswrapper[4764]: E0320 14:58:00.194581 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.194602 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 14:58:00 crc kubenswrapper[4764]: E0320 14:58:00.194624 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" containerName="installer" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.194639 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" containerName="installer" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.194811 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.194846 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4da6599-c529-49c6-a409-e6abdec42a79" containerName="installer" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.195526 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.197952 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.198482 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.198495 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.216340 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-kl2wd"] Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.286904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb84v\" (UniqueName: \"kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v\") pod \"auto-csr-approver-29566978-kl2wd\" (UID: \"bc39d516-80da-4091-842b-2bcef48bcc57\") " pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.389045 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb84v\" (UniqueName: \"kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v\") pod \"auto-csr-approver-29566978-kl2wd\" (UID: \"bc39d516-80da-4091-842b-2bcef48bcc57\") " pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.415744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb84v\" (UniqueName: \"kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v\") pod \"auto-csr-approver-29566978-kl2wd\" (UID: \"bc39d516-80da-4091-842b-2bcef48bcc57\") " 
pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.529762 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:00 crc kubenswrapper[4764]: I0320 14:58:00.802546 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-kl2wd"] Mar 20 14:58:01 crc kubenswrapper[4764]: I0320 14:58:01.008141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" event={"ID":"bc39d516-80da-4091-842b-2bcef48bcc57","Type":"ContainerStarted","Data":"335102ed11f3e76dcb2a4f216d22ebfeea950cbd09ff912a94d8bdf1d5043de1"} Mar 20 14:58:03 crc kubenswrapper[4764]: I0320 14:58:03.023899 4764 generic.go:334] "Generic (PLEG): container finished" podID="bc39d516-80da-4091-842b-2bcef48bcc57" containerID="a89ce12233e8c868c31745aaf4798551a59ff9219e07e009b78099d9eaa24730" exitCode=0 Mar 20 14:58:03 crc kubenswrapper[4764]: I0320 14:58:03.023971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" event={"ID":"bc39d516-80da-4091-842b-2bcef48bcc57","Type":"ContainerDied","Data":"a89ce12233e8c868c31745aaf4798551a59ff9219e07e009b78099d9eaa24730"} Mar 20 14:58:04 crc kubenswrapper[4764]: I0320 14:58:04.410726 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:04 crc kubenswrapper[4764]: I0320 14:58:04.553130 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb84v\" (UniqueName: \"kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v\") pod \"bc39d516-80da-4091-842b-2bcef48bcc57\" (UID: \"bc39d516-80da-4091-842b-2bcef48bcc57\") " Mar 20 14:58:04 crc kubenswrapper[4764]: I0320 14:58:04.564328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v" (OuterVolumeSpecName: "kube-api-access-kb84v") pod "bc39d516-80da-4091-842b-2bcef48bcc57" (UID: "bc39d516-80da-4091-842b-2bcef48bcc57"). InnerVolumeSpecName "kube-api-access-kb84v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:04 crc kubenswrapper[4764]: I0320 14:58:04.654864 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb84v\" (UniqueName: \"kubernetes.io/projected/bc39d516-80da-4091-842b-2bcef48bcc57-kube-api-access-kb84v\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:05 crc kubenswrapper[4764]: I0320 14:58:05.039654 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" event={"ID":"bc39d516-80da-4091-842b-2bcef48bcc57","Type":"ContainerDied","Data":"335102ed11f3e76dcb2a4f216d22ebfeea950cbd09ff912a94d8bdf1d5043de1"} Mar 20 14:58:05 crc kubenswrapper[4764]: I0320 14:58:05.039710 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="335102ed11f3e76dcb2a4f216d22ebfeea950cbd09ff912a94d8bdf1d5043de1" Mar 20 14:58:05 crc kubenswrapper[4764]: I0320 14:58:05.039778 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-kl2wd" Mar 20 14:58:08 crc kubenswrapper[4764]: I0320 14:58:08.443256 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:58:08 crc kubenswrapper[4764]: I0320 14:58:08.443748 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:58:38 crc kubenswrapper[4764]: I0320 14:58:38.443724 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:58:38 crc kubenswrapper[4764]: I0320 14:58:38.444545 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.795100 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.795942 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7m8j" podUID="548554c3-21d2-4406-a509-e80303628f56" 
containerName="registry-server" containerID="cri-o://59150e1e0d63536c0f51e2f90ac24585b123050217841fefc6b6cebc3b3e6c70" gracePeriod=30 Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.810747 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.811038 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cmxv2" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="registry-server" containerID="cri-o://e90443574f4f719172f02718115464a18cce41db48c9e44c1aa44193f1c468a8" gracePeriod=30 Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.826494 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"] Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.826789 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" containerID="cri-o://0d0fbd440c8296db8ad3a9866cb5b928779aa02b85d6f9bff1e43e93ad722114" gracePeriod=30 Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.837006 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.837356 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44fr6" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="registry-server" containerID="cri-o://5a074b1406ab1ac382f7073b289cc8c6fe55630339d2f797bfa5c15530b7afc5" gracePeriod=30 Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.844775 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:58:39 crc 
kubenswrapper[4764]: I0320 14:58:39.845079 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fzfpf" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="registry-server" containerID="cri-o://3942a276f8cb67ec8c9ec580140afcfecb1fe16ad139d8551771d3167d0896f5" gracePeriod=30 Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.871971 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgkcv"] Mar 20 14:58:39 crc kubenswrapper[4764]: E0320 14:58:39.872226 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc39d516-80da-4091-842b-2bcef48bcc57" containerName="oc" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.872241 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc39d516-80da-4091-842b-2bcef48bcc57" containerName="oc" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.872402 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc39d516-80da-4091-842b-2bcef48bcc57" containerName="oc" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.872849 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.882973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgkcv"] Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.994407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8cs5\" (UniqueName: \"kubernetes.io/projected/ad389133-cb52-4ed9-9261-daeb7a5cb13e-kube-api-access-v8cs5\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.994489 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:39 crc kubenswrapper[4764]: I0320 14:58:39.994548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.095096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: 
\"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.095176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.095204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8cs5\" (UniqueName: \"kubernetes.io/projected/ad389133-cb52-4ed9-9261-daeb7a5cb13e-kube-api-access-v8cs5\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.096786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.109500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ad389133-cb52-4ed9-9261-daeb7a5cb13e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.112251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v8cs5\" (UniqueName: \"kubernetes.io/projected/ad389133-cb52-4ed9-9261-daeb7a5cb13e-kube-api-access-v8cs5\") pod \"marketplace-operator-79b997595-wgkcv\" (UID: \"ad389133-cb52-4ed9-9261-daeb7a5cb13e\") " pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.316085 4764 generic.go:334] "Generic (PLEG): container finished" podID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerID="5a074b1406ab1ac382f7073b289cc8c6fe55630339d2f797bfa5c15530b7afc5" exitCode=0 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.316263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerDied","Data":"5a074b1406ab1ac382f7073b289cc8c6fe55630339d2f797bfa5c15530b7afc5"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.316739 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44fr6" event={"ID":"830768c7-49e2-4ed5-af8e-3762dc00534e","Type":"ContainerDied","Data":"07dfdebd1cc7c6971276329cbe403c743b9960a69ae94759aa4101b53c960564"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.316816 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07dfdebd1cc7c6971276329cbe403c743b9960a69ae94759aa4101b53c960564" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.318726 4764 generic.go:334] "Generic (PLEG): container finished" podID="548554c3-21d2-4406-a509-e80303628f56" containerID="59150e1e0d63536c0f51e2f90ac24585b123050217841fefc6b6cebc3b3e6c70" exitCode=0 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.318834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerDied","Data":"59150e1e0d63536c0f51e2f90ac24585b123050217841fefc6b6cebc3b3e6c70"} Mar 20 14:58:40 crc 
kubenswrapper[4764]: I0320 14:58:40.318908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7m8j" event={"ID":"548554c3-21d2-4406-a509-e80303628f56","Type":"ContainerDied","Data":"8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.319027 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b566261c1565e366c658c76912f950d7dba469f4d16b3c42932121ace0d9b08" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.322228 4764 generic.go:334] "Generic (PLEG): container finished" podID="4a051746-92b7-4a16-a641-d73888dcfcca" containerID="3942a276f8cb67ec8c9ec580140afcfecb1fe16ad139d8551771d3167d0896f5" exitCode=0 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.322329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerDied","Data":"3942a276f8cb67ec8c9ec580140afcfecb1fe16ad139d8551771d3167d0896f5"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.323750 4764 generic.go:334] "Generic (PLEG): container finished" podID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerID="0d0fbd440c8296db8ad3a9866cb5b928779aa02b85d6f9bff1e43e93ad722114" exitCode=0 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.323857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerDied","Data":"0d0fbd440c8296db8ad3a9866cb5b928779aa02b85d6f9bff1e43e93ad722114"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.323934 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" 
event={"ID":"143a8092-b930-4ffb-8414-eda1d808fb8c","Type":"ContainerDied","Data":"fbb676f2ba016ddba4c16ca445139044816d2b411509ced4aa5aa17c4da13e01"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.323998 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbb676f2ba016ddba4c16ca445139044816d2b411509ced4aa5aa17c4da13e01" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.324060 4764 scope.go:117] "RemoveContainer" containerID="fe3d59b4dd5fb1167dcb24814c525c24bc646126092a919a5c0be4be171f0bca" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.326270 4764 generic.go:334] "Generic (PLEG): container finished" podID="67e76e77-4199-4fdd-b755-10cab62e1370" containerID="e90443574f4f719172f02718115464a18cce41db48c9e44c1aa44193f1c468a8" exitCode=0 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.326363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerDied","Data":"e90443574f4f719172f02718115464a18cce41db48c9e44c1aa44193f1c468a8"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.326466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmxv2" event={"ID":"67e76e77-4199-4fdd-b755-10cab62e1370","Type":"ContainerDied","Data":"e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451"} Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.326526 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9ced8f9bfa238ed9cd19700969c06fa94fdd9f4877a29d594b3a7f8e46df451" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.332622 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.335976 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.339543 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.344975 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.347588 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities\") pod \"67e76e77-4199-4fdd-b755-10cab62e1370\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content\") pod \"67e76e77-4199-4fdd-b755-10cab62e1370\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content\") pod \"830768c7-49e2-4ed5-af8e-3762dc00534e\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n76fb\" (UniqueName: 
\"kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb\") pod \"548554c3-21d2-4406-a509-e80303628f56\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content\") pod \"548554c3-21d2-4406-a509-e80303628f56\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8r2\" (UniqueName: \"kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2\") pod \"67e76e77-4199-4fdd-b755-10cab62e1370\" (UID: \"67e76e77-4199-4fdd-b755-10cab62e1370\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502903 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca\") pod \"143a8092-b930-4ffb-8414-eda1d808fb8c\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities\") pod \"548554c3-21d2-4406-a509-e80303628f56\" (UID: \"548554c3-21d2-4406-a509-e80303628f56\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.502976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj459\" (UniqueName: \"kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459\") pod \"830768c7-49e2-4ed5-af8e-3762dc00534e\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " Mar 20 
14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.503009 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities\") pod \"830768c7-49e2-4ed5-af8e-3762dc00534e\" (UID: \"830768c7-49e2-4ed5-af8e-3762dc00534e\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.503030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics\") pod \"143a8092-b930-4ffb-8414-eda1d808fb8c\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.503054 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl6v9\" (UniqueName: \"kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9\") pod \"143a8092-b930-4ffb-8414-eda1d808fb8c\" (UID: \"143a8092-b930-4ffb-8414-eda1d808fb8c\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.504569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities" (OuterVolumeSpecName: "utilities") pod "830768c7-49e2-4ed5-af8e-3762dc00534e" (UID: "830768c7-49e2-4ed5-af8e-3762dc00534e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.504633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities" (OuterVolumeSpecName: "utilities") pod "67e76e77-4199-4fdd-b755-10cab62e1370" (UID: "67e76e77-4199-4fdd-b755-10cab62e1370"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.504664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities" (OuterVolumeSpecName: "utilities") pod "548554c3-21d2-4406-a509-e80303628f56" (UID: "548554c3-21d2-4406-a509-e80303628f56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.505187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "143a8092-b930-4ffb-8414-eda1d808fb8c" (UID: "143a8092-b930-4ffb-8414-eda1d808fb8c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.508530 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459" (OuterVolumeSpecName: "kube-api-access-dj459") pod "830768c7-49e2-4ed5-af8e-3762dc00534e" (UID: "830768c7-49e2-4ed5-af8e-3762dc00534e"). InnerVolumeSpecName "kube-api-access-dj459". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.509502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2" (OuterVolumeSpecName: "kube-api-access-pb8r2") pod "67e76e77-4199-4fdd-b755-10cab62e1370" (UID: "67e76e77-4199-4fdd-b755-10cab62e1370"). InnerVolumeSpecName "kube-api-access-pb8r2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.510074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9" (OuterVolumeSpecName: "kube-api-access-rl6v9") pod "143a8092-b930-4ffb-8414-eda1d808fb8c" (UID: "143a8092-b930-4ffb-8414-eda1d808fb8c"). InnerVolumeSpecName "kube-api-access-rl6v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.510319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb" (OuterVolumeSpecName: "kube-api-access-n76fb") pod "548554c3-21d2-4406-a509-e80303628f56" (UID: "548554c3-21d2-4406-a509-e80303628f56"). InnerVolumeSpecName "kube-api-access-n76fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.511251 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "143a8092-b930-4ffb-8414-eda1d808fb8c" (UID: "143a8092-b930-4ffb-8414-eda1d808fb8c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.528372 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "830768c7-49e2-4ed5-af8e-3762dc00534e" (UID: "830768c7-49e2-4ed5-af8e-3762dc00534e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.567829 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wgkcv"] Mar 20 14:58:40 crc kubenswrapper[4764]: W0320 14:58:40.572860 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad389133_cb52_4ed9_9261_daeb7a5cb13e.slice/crio-fd903f4fad45412993e78c98b6512160a2b26c0b8e0f35f00e8ee0dbe2612019 WatchSource:0}: Error finding container fd903f4fad45412993e78c98b6512160a2b26c0b8e0f35f00e8ee0dbe2612019: Status 404 returned error can't find the container with id fd903f4fad45412993e78c98b6512160a2b26c0b8e0f35f00e8ee0dbe2612019 Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.591948 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67e76e77-4199-4fdd-b755-10cab62e1370" (UID: "67e76e77-4199-4fdd-b755-10cab62e1370"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604495 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604524 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj459\" (UniqueName: \"kubernetes.io/projected/830768c7-49e2-4ed5-af8e-3762dc00534e-kube-api-access-dj459\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604534 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604545 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604554 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/143a8092-b930-4ffb-8414-eda1d808fb8c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604562 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl6v9\" (UniqueName: \"kubernetes.io/projected/143a8092-b930-4ffb-8414-eda1d808fb8c-kube-api-access-rl6v9\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604570 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc 
kubenswrapper[4764]: I0320 14:58:40.604578 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e76e77-4199-4fdd-b755-10cab62e1370-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604586 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/830768c7-49e2-4ed5-af8e-3762dc00534e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604596 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n76fb\" (UniqueName: \"kubernetes.io/projected/548554c3-21d2-4406-a509-e80303628f56-kube-api-access-n76fb\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.604606 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8r2\" (UniqueName: \"kubernetes.io/projected/67e76e77-4199-4fdd-b755-10cab62e1370-kube-api-access-pb8r2\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.610412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "548554c3-21d2-4406-a509-e80303628f56" (UID: "548554c3-21d2-4406-a509-e80303628f56"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.706183 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548554c3-21d2-4406-a509-e80303628f56-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.727350 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.907969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content\") pod \"4a051746-92b7-4a16-a641-d73888dcfcca\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.908039 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6xz7\" (UniqueName: \"kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7\") pod \"4a051746-92b7-4a16-a641-d73888dcfcca\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.908103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities\") pod \"4a051746-92b7-4a16-a641-d73888dcfcca\" (UID: \"4a051746-92b7-4a16-a641-d73888dcfcca\") " Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.909086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities" (OuterVolumeSpecName: "utilities") pod "4a051746-92b7-4a16-a641-d73888dcfcca" (UID: "4a051746-92b7-4a16-a641-d73888dcfcca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:40 crc kubenswrapper[4764]: I0320 14:58:40.915557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7" (OuterVolumeSpecName: "kube-api-access-n6xz7") pod "4a051746-92b7-4a16-a641-d73888dcfcca" (UID: "4a051746-92b7-4a16-a641-d73888dcfcca"). InnerVolumeSpecName "kube-api-access-n6xz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.009282 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.009307 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6xz7\" (UniqueName: \"kubernetes.io/projected/4a051746-92b7-4a16-a641-d73888dcfcca-kube-api-access-n6xz7\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.103247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a051746-92b7-4a16-a641-d73888dcfcca" (UID: "4a051746-92b7-4a16-a641-d73888dcfcca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.109990 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a051746-92b7-4a16-a641-d73888dcfcca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.335641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" event={"ID":"ad389133-cb52-4ed9-9261-daeb7a5cb13e","Type":"ContainerStarted","Data":"d1051c95e5f0c9e0dfa39579819dadabae97453be3e189b80dc29edca34258da"} Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.335691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" event={"ID":"ad389133-cb52-4ed9-9261-daeb7a5cb13e","Type":"ContainerStarted","Data":"fd903f4fad45412993e78c98b6512160a2b26c0b8e0f35f00e8ee0dbe2612019"} Mar 20 14:58:41 
crc kubenswrapper[4764]: I0320 14:58:41.336048 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.339757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fzfpf" event={"ID":"4a051746-92b7-4a16-a641-d73888dcfcca","Type":"ContainerDied","Data":"c3b7e8fc8712ea940f2df816c005b5ec66ad3a157dc223b0d1f0c821d384a5a5"} Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.339802 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fzfpf" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.339826 4764 scope.go:117] "RemoveContainer" containerID="3942a276f8cb67ec8c9ec580140afcfecb1fe16ad139d8551771d3167d0896f5" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.340734 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.342517 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hkbns" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.342750 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmxv2" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.342780 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7m8j" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.342767 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44fr6" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.363421 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wgkcv" podStartSLOduration=2.363404826 podStartE2EDuration="2.363404826s" podCreationTimestamp="2026-03-20 14:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:58:41.358876878 +0000 UTC m=+442.975066067" watchObservedRunningTime="2026-03-20 14:58:41.363404826 +0000 UTC m=+442.979593955" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.372760 4764 scope.go:117] "RemoveContainer" containerID="578fa9430a1ab0030c466710bf7cc43c73d5be74517d9d892f7b1a0e9487eeda" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.375286 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.406532 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7m8j"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.409801 4764 scope.go:117] "RemoveContainer" containerID="f70cad435924bb94f14bff90c4487561a5322a941d008a84edc5d8b1cc212fb4" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.431403 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.435684 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hkbns"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.444369 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.456614 4764 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cmxv2"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.476729 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.477474 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fzfpf"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.481063 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.493470 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44fr6"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.498896 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c9mxm"] Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499723 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499742 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499751 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499758 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: 
I0320 14:58:41.499772 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499783 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499798 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499810 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499815 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499823 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499828 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499838 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 
14:58:41.499844 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499853 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499859 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499868 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499873 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499883 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499889 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499895 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499901 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499911 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="extract-utilities" Mar 20 14:58:41 crc 
kubenswrapper[4764]: I0320 14:58:41.499916 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="extract-utilities" Mar 20 14:58:41 crc kubenswrapper[4764]: E0320 14:58:41.499926 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.499931 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="extract-content" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500011 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="548554c3-21d2-4406-a509-e80303628f56" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500115 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500131 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500146 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500155 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" containerName="marketplace-operator" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.500164 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" containerName="registry-server" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.503079 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.504962 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c9mxm"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.615934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.615981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-certificates\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-tls\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616032 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-trusted-ca\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616214 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84c5f\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-kube-api-access-84c5f\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.616372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-bound-sa-token\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.647758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84c5f\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-kube-api-access-84c5f\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-bound-sa-token\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-certificates\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-tls\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718570 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-trusted-ca\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.718689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.719357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-ca-trust-extracted\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.720463 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-trusted-ca\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 
14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.720596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-certificates\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.736942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-installation-pull-secrets\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.737337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-registry-tls\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.739333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-bound-sa-token\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.741827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84c5f\" (UniqueName: \"kubernetes.io/projected/d7060914-efe9-4bf7-b44f-a4bbd64b9e94-kube-api-access-84c5f\") pod \"image-registry-66df7c8f76-c9mxm\" (UID: \"d7060914-efe9-4bf7-b44f-a4bbd64b9e94\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.811447 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dddff"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.813471 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.818183 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.821805 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.827355 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dddff"] Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.921113 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-catalog-content\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.921174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thldw\" (UniqueName: \"kubernetes.io/projected/68519d16-b007-47ac-9070-20ac1b3c3b2f-kube-api-access-thldw\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:41 crc kubenswrapper[4764]: I0320 14:58:41.921285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-utilities\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.022458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thldw\" (UniqueName: \"kubernetes.io/projected/68519d16-b007-47ac-9070-20ac1b3c3b2f-kube-api-access-thldw\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.022829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-utilities\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.022893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-catalog-content\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.023545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-catalog-content\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.023822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/68519d16-b007-47ac-9070-20ac1b3c3b2f-utilities\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.045102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thldw\" (UniqueName: \"kubernetes.io/projected/68519d16-b007-47ac-9070-20ac1b3c3b2f-kube-api-access-thldw\") pod \"certified-operators-dddff\" (UID: \"68519d16-b007-47ac-9070-20ac1b3c3b2f\") " pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.061876 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-c9mxm"] Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.148834 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.351501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" event={"ID":"d7060914-efe9-4bf7-b44f-a4bbd64b9e94","Type":"ContainerStarted","Data":"d038ed04c4fa36bca9ea72ad62bd0c80e49cc89ba92712d89f47cee7e280b15d"} Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.351740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" event={"ID":"d7060914-efe9-4bf7-b44f-a4bbd64b9e94","Type":"ContainerStarted","Data":"7bf1c14ce6a05fd10e6421f4dfcca0e681dac0f71a06affb000da65902d7abb5"} Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.351754 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.372463 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" podStartSLOduration=1.372447928 podStartE2EDuration="1.372447928s" podCreationTimestamp="2026-03-20 14:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:58:42.370750163 +0000 UTC m=+443.986939282" watchObservedRunningTime="2026-03-20 14:58:42.372447928 +0000 UTC m=+443.988637057" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.413942 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-25sqr"] Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.414875 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.417569 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.418424 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dddff"] Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.421915 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-25sqr"] Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.527279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-catalog-content\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.527330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-utilities\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.527348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjj9\" (UniqueName: \"kubernetes.io/projected/5da442f8-8370-4046-9d12-a9a301d61a58-kube-api-access-6cjj9\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.628897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-catalog-content\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.628944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-utilities\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.628963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjj9\" (UniqueName: \"kubernetes.io/projected/5da442f8-8370-4046-9d12-a9a301d61a58-kube-api-access-6cjj9\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.629625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-catalog-content\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.630061 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da442f8-8370-4046-9d12-a9a301d61a58-utilities\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.646913 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjj9\" (UniqueName: \"kubernetes.io/projected/5da442f8-8370-4046-9d12-a9a301d61a58-kube-api-access-6cjj9\") pod \"redhat-marketplace-25sqr\" (UID: \"5da442f8-8370-4046-9d12-a9a301d61a58\") " pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:42 crc kubenswrapper[4764]: I0320 14:58:42.736019 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.136178 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143a8092-b930-4ffb-8414-eda1d808fb8c" path="/var/lib/kubelet/pods/143a8092-b930-4ffb-8414-eda1d808fb8c/volumes" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.137525 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a051746-92b7-4a16-a641-d73888dcfcca" path="/var/lib/kubelet/pods/4a051746-92b7-4a16-a641-d73888dcfcca/volumes" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.140664 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548554c3-21d2-4406-a509-e80303628f56" path="/var/lib/kubelet/pods/548554c3-21d2-4406-a509-e80303628f56/volumes" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.144040 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e76e77-4199-4fdd-b755-10cab62e1370" path="/var/lib/kubelet/pods/67e76e77-4199-4fdd-b755-10cab62e1370/volumes" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.147696 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830768c7-49e2-4ed5-af8e-3762dc00534e" path="/var/lib/kubelet/pods/830768c7-49e2-4ed5-af8e-3762dc00534e/volumes" Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.148404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-25sqr"] Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.359214 4764 generic.go:334] "Generic (PLEG): container finished" podID="5da442f8-8370-4046-9d12-a9a301d61a58" containerID="905426df2673c07f54e5a4ee6641b3f085184013614b4fc8aaf5721d0a7aaf60" exitCode=0 Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.359355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25sqr" 
event={"ID":"5da442f8-8370-4046-9d12-a9a301d61a58","Type":"ContainerDied","Data":"905426df2673c07f54e5a4ee6641b3f085184013614b4fc8aaf5721d0a7aaf60"} Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.359599 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25sqr" event={"ID":"5da442f8-8370-4046-9d12-a9a301d61a58","Type":"ContainerStarted","Data":"656863e20b56a90acf2e6a02481d66bf345c78f27c04839cd829a40039ff5ce9"} Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.360698 4764 generic.go:334] "Generic (PLEG): container finished" podID="68519d16-b007-47ac-9070-20ac1b3c3b2f" containerID="84147b9f85c42d73e189020bf74bad98f70a9fde7bfda5934c261500aaaf7c5a" exitCode=0 Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.361016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddff" event={"ID":"68519d16-b007-47ac-9070-20ac1b3c3b2f","Type":"ContainerDied","Data":"84147b9f85c42d73e189020bf74bad98f70a9fde7bfda5934c261500aaaf7c5a"} Mar 20 14:58:43 crc kubenswrapper[4764]: I0320 14:58:43.361321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddff" event={"ID":"68519d16-b007-47ac-9070-20ac1b3c3b2f","Type":"ContainerStarted","Data":"4c4dacd55704e279cc426154313d2810c8a4d5177f82a96d64871598975f4410"} Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.218755 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2672k"] Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.220905 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.225256 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.230960 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2672k"] Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.356394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-utilities\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.356713 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbk8\" (UniqueName: \"kubernetes.io/projected/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-kube-api-access-whbk8\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.356757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-catalog-content\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.374051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25sqr" event={"ID":"5da442f8-8370-4046-9d12-a9a301d61a58","Type":"ContainerStarted","Data":"6350c49089f683112be6b5f33c5fcc7bee47237ba7344b2d0525e22478a52ea6"} Mar 20 
14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.458229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbk8\" (UniqueName: \"kubernetes.io/projected/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-kube-api-access-whbk8\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.458307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-catalog-content\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.458411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-utilities\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.458819 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-catalog-content\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.458935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-utilities\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.478405 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbk8\" (UniqueName: \"kubernetes.io/projected/d3ad1633-003c-4d92-bdfa-c0f6c1957cfd-kube-api-access-whbk8\") pod \"redhat-operators-2672k\" (UID: \"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd\") " pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.615090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.817085 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4td2"] Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.819682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.825004 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4td2"] Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.827424 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.861136 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2672k"] Mar 20 14:58:44 crc kubenswrapper[4764]: W0320 14:58:44.864126 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ad1633_003c_4d92_bdfa_c0f6c1957cfd.slice/crio-f6ea1a2297c2d76f00f8ba4da030c1737ce876f88d5b001c85a57dbdc0f8ef47 WatchSource:0}: Error finding container f6ea1a2297c2d76f00f8ba4da030c1737ce876f88d5b001c85a57dbdc0f8ef47: Status 404 returned error can't find the container with id f6ea1a2297c2d76f00f8ba4da030c1737ce876f88d5b001c85a57dbdc0f8ef47 Mar 20 14:58:44 crc kubenswrapper[4764]: 
I0320 14:58:44.966672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-catalog-content\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.966738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-utilities\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:44 crc kubenswrapper[4764]: I0320 14:58:44.966821 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2vl\" (UniqueName: \"kubernetes.io/projected/656d8d0a-7ccf-4216-bde0-8e12af697dc0-kube-api-access-hh2vl\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.068528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-catalog-content\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.068719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-utilities\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 
14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.068936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2vl\" (UniqueName: \"kubernetes.io/projected/656d8d0a-7ccf-4216-bde0-8e12af697dc0-kube-api-access-hh2vl\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.069271 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-utilities\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.069288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/656d8d0a-7ccf-4216-bde0-8e12af697dc0-catalog-content\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.097582 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2vl\" (UniqueName: \"kubernetes.io/projected/656d8d0a-7ccf-4216-bde0-8e12af697dc0-kube-api-access-hh2vl\") pod \"community-operators-d4td2\" (UID: \"656d8d0a-7ccf-4216-bde0-8e12af697dc0\") " pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:45 crc kubenswrapper[4764]: I0320 14:58:45.141925 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.382282 4764 generic.go:334] "Generic (PLEG): container finished" podID="5da442f8-8370-4046-9d12-a9a301d61a58" containerID="6350c49089f683112be6b5f33c5fcc7bee47237ba7344b2d0525e22478a52ea6" exitCode=0 Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.382365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25sqr" event={"ID":"5da442f8-8370-4046-9d12-a9a301d61a58","Type":"ContainerDied","Data":"6350c49089f683112be6b5f33c5fcc7bee47237ba7344b2d0525e22478a52ea6"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.384179 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3ad1633-003c-4d92-bdfa-c0f6c1957cfd" containerID="9797887486c353e6754a414c36ba28d0fa5223e0d6836ba3b615eb2d84ba9013" exitCode=0 Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.384229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2672k" event={"ID":"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd","Type":"ContainerDied","Data":"9797887486c353e6754a414c36ba28d0fa5223e0d6836ba3b615eb2d84ba9013"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.384312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2672k" event={"ID":"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd","Type":"ContainerStarted","Data":"f6ea1a2297c2d76f00f8ba4da030c1737ce876f88d5b001c85a57dbdc0f8ef47"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.388398 4764 generic.go:334] "Generic (PLEG): container finished" podID="68519d16-b007-47ac-9070-20ac1b3c3b2f" containerID="13a7f130e7fb057faf702557b217316f43faab23053aa36af7c6c68213fdd314" exitCode=0 Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:45.388456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddff" 
event={"ID":"68519d16-b007-47ac-9070-20ac1b3c3b2f","Type":"ContainerDied","Data":"13a7f130e7fb057faf702557b217316f43faab23053aa36af7c6c68213fdd314"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.397581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2672k" event={"ID":"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd","Type":"ContainerStarted","Data":"c6cba1afaf8f7049eefe993496059e52dc694ed88a2904d9e6d764846eb84aa1"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.409678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dddff" event={"ID":"68519d16-b007-47ac-9070-20ac1b3c3b2f","Type":"ContainerStarted","Data":"3a74b9ba177acd399ca22e5a535523bb887f36b7c8781adbb26aff223569f04c"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.411969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-25sqr" event={"ID":"5da442f8-8370-4046-9d12-a9a301d61a58","Type":"ContainerStarted","Data":"e4a40cc09c5e630550f80013c67f12e97a5609d7af9433b3cb080339494b9ecf"} Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.434613 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-25sqr" podStartSLOduration=1.734894581 podStartE2EDuration="4.434597631s" podCreationTimestamp="2026-03-20 14:58:42 +0000 UTC" firstStartedPulling="2026-03-20 14:58:43.361715162 +0000 UTC m=+444.977904291" lastFinishedPulling="2026-03-20 14:58:46.061418212 +0000 UTC m=+447.677607341" observedRunningTime="2026-03-20 14:58:46.433757869 +0000 UTC m=+448.049946998" watchObservedRunningTime="2026-03-20 14:58:46.434597631 +0000 UTC m=+448.050786760" Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.451813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dddff" podStartSLOduration=2.693649255 
podStartE2EDuration="5.451787051s" podCreationTimestamp="2026-03-20 14:58:41 +0000 UTC" firstStartedPulling="2026-03-20 14:58:43.364227918 +0000 UTC m=+444.980417047" lastFinishedPulling="2026-03-20 14:58:46.122365714 +0000 UTC m=+447.738554843" observedRunningTime="2026-03-20 14:58:46.449254374 +0000 UTC m=+448.065443503" watchObservedRunningTime="2026-03-20 14:58:46.451787051 +0000 UTC m=+448.067976210" Mar 20 14:58:46 crc kubenswrapper[4764]: I0320 14:58:46.610113 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4td2"] Mar 20 14:58:46 crc kubenswrapper[4764]: W0320 14:58:46.618439 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656d8d0a_7ccf_4216_bde0_8e12af697dc0.slice/crio-397099da560e9e0da12c98fd7bd109e313f20cec2197f264ef3b2ae945ad48d4 WatchSource:0}: Error finding container 397099da560e9e0da12c98fd7bd109e313f20cec2197f264ef3b2ae945ad48d4: Status 404 returned error can't find the container with id 397099da560e9e0da12c98fd7bd109e313f20cec2197f264ef3b2ae945ad48d4 Mar 20 14:58:47 crc kubenswrapper[4764]: I0320 14:58:47.419001 4764 generic.go:334] "Generic (PLEG): container finished" podID="656d8d0a-7ccf-4216-bde0-8e12af697dc0" containerID="0c77e9bbd646af23d412dd25a370be370addb69f40a2ca90b30ae91eca6caf77" exitCode=0 Mar 20 14:58:47 crc kubenswrapper[4764]: I0320 14:58:47.419071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4td2" event={"ID":"656d8d0a-7ccf-4216-bde0-8e12af697dc0","Type":"ContainerDied","Data":"0c77e9bbd646af23d412dd25a370be370addb69f40a2ca90b30ae91eca6caf77"} Mar 20 14:58:47 crc kubenswrapper[4764]: I0320 14:58:47.419439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4td2" 
event={"ID":"656d8d0a-7ccf-4216-bde0-8e12af697dc0","Type":"ContainerStarted","Data":"397099da560e9e0da12c98fd7bd109e313f20cec2197f264ef3b2ae945ad48d4"} Mar 20 14:58:47 crc kubenswrapper[4764]: I0320 14:58:47.423143 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3ad1633-003c-4d92-bdfa-c0f6c1957cfd" containerID="c6cba1afaf8f7049eefe993496059e52dc694ed88a2904d9e6d764846eb84aa1" exitCode=0 Mar 20 14:58:47 crc kubenswrapper[4764]: I0320 14:58:47.423168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2672k" event={"ID":"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd","Type":"ContainerDied","Data":"c6cba1afaf8f7049eefe993496059e52dc694ed88a2904d9e6d764846eb84aa1"} Mar 20 14:58:48 crc kubenswrapper[4764]: I0320 14:58:48.430963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4td2" event={"ID":"656d8d0a-7ccf-4216-bde0-8e12af697dc0","Type":"ContainerStarted","Data":"7e3dde5d46252b075b6403565d2d256ee91c8ba477dbf5087a5e928489edc470"} Mar 20 14:58:48 crc kubenswrapper[4764]: I0320 14:58:48.437964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2672k" event={"ID":"d3ad1633-003c-4d92-bdfa-c0f6c1957cfd","Type":"ContainerStarted","Data":"3a1bb501e4091c39a6e7d877d738892be4b03d53f6af18b19cf78ca3cfbeb1f7"} Mar 20 14:58:48 crc kubenswrapper[4764]: I0320 14:58:48.482991 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2672k" podStartSLOduration=1.952884025 podStartE2EDuration="4.482950324s" podCreationTimestamp="2026-03-20 14:58:44 +0000 UTC" firstStartedPulling="2026-03-20 14:58:45.386003526 +0000 UTC m=+447.002192675" lastFinishedPulling="2026-03-20 14:58:47.916069835 +0000 UTC m=+449.532258974" observedRunningTime="2026-03-20 14:58:48.475810298 +0000 UTC m=+450.091999467" watchObservedRunningTime="2026-03-20 14:58:48.482950324 +0000 UTC m=+450.099139483" 
Mar 20 14:58:49 crc kubenswrapper[4764]: I0320 14:58:49.457730 4764 generic.go:334] "Generic (PLEG): container finished" podID="656d8d0a-7ccf-4216-bde0-8e12af697dc0" containerID="7e3dde5d46252b075b6403565d2d256ee91c8ba477dbf5087a5e928489edc470" exitCode=0 Mar 20 14:58:49 crc kubenswrapper[4764]: I0320 14:58:49.458830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4td2" event={"ID":"656d8d0a-7ccf-4216-bde0-8e12af697dc0","Type":"ContainerDied","Data":"7e3dde5d46252b075b6403565d2d256ee91c8ba477dbf5087a5e928489edc470"} Mar 20 14:58:50 crc kubenswrapper[4764]: I0320 14:58:50.467745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4td2" event={"ID":"656d8d0a-7ccf-4216-bde0-8e12af697dc0","Type":"ContainerStarted","Data":"5691325c0d34c95cf625ebd90560f06e43e08ab84600c554cf78f27d72b8acc2"} Mar 20 14:58:50 crc kubenswrapper[4764]: I0320 14:58:50.500654 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4td2" podStartSLOduration=4.074382501 podStartE2EDuration="6.500628776s" podCreationTimestamp="2026-03-20 14:58:44 +0000 UTC" firstStartedPulling="2026-03-20 14:58:47.420666033 +0000 UTC m=+449.036855182" lastFinishedPulling="2026-03-20 14:58:49.846912288 +0000 UTC m=+451.463101457" observedRunningTime="2026-03-20 14:58:50.493782897 +0000 UTC m=+452.109972056" watchObservedRunningTime="2026-03-20 14:58:50.500628776 +0000 UTC m=+452.116817945" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.149337 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.149440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.199753 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.555968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dddff" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.736804 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.736876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:52 crc kubenswrapper[4764]: I0320 14:58:52.802105 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:53 crc kubenswrapper[4764]: I0320 14:58:53.561303 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-25sqr" Mar 20 14:58:54 crc kubenswrapper[4764]: I0320 14:58:54.615677 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:54 crc kubenswrapper[4764]: I0320 14:58:54.615749 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:58:55 crc kubenswrapper[4764]: I0320 14:58:55.390627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:55 crc kubenswrapper[4764]: I0320 14:58:55.390688 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:55 crc kubenswrapper[4764]: I0320 14:58:55.427313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4td2" 
Mar 20 14:58:55 crc kubenswrapper[4764]: I0320 14:58:55.552284 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4td2" Mar 20 14:58:55 crc kubenswrapper[4764]: I0320 14:58:55.684624 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2672k" podUID="d3ad1633-003c-4d92-bdfa-c0f6c1957cfd" containerName="registry-server" probeResult="failure" output=< Mar 20 14:58:55 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 14:58:55 crc kubenswrapper[4764]: > Mar 20 14:59:01 crc kubenswrapper[4764]: I0320 14:59:01.832073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-c9mxm" Mar 20 14:59:01 crc kubenswrapper[4764]: I0320 14:59:01.926637 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"] Mar 20 14:59:04 crc kubenswrapper[4764]: I0320 14:59:04.674771 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:59:04 crc kubenswrapper[4764]: I0320 14:59:04.730276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2672k" Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.444164 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.444738 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.444804 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.445749 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.445858 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336" gracePeriod=600 Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.593277 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336" exitCode=0 Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.593456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336"} Mar 20 14:59:08 crc kubenswrapper[4764]: I0320 14:59:08.593725 4764 scope.go:117] "RemoveContainer" containerID="a9ea8c7fdd84c93038167eff3ea9246b1f40c71af9253b4aadbce38d38b57f2c" Mar 20 14:59:09 crc kubenswrapper[4764]: I0320 14:59:09.603610 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c"} Mar 20 14:59:26 crc kubenswrapper[4764]: I0320 14:59:26.978335 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" podUID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" containerName="registry" containerID="cri-o://87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885" gracePeriod=30 Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.452293 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563273 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563336 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563886 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.563992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-988cf\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf\") pod \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\" (UID: \"fd07d531-e6b9-4b58-9e6c-8012b3a473eb\") " Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.564669 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates" (OuterVolumeSpecName: 
"registry-certificates") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.564763 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.565091 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.565131 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.574019 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.576463 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.581366 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.581859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf" (OuterVolumeSpecName: "kube-api-access-988cf") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "kube-api-access-988cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.583863 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.602208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fd07d531-e6b9-4b58-9e6c-8012b3a473eb" (UID: "fd07d531-e6b9-4b58-9e6c-8012b3a473eb"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.665762 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-988cf\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-kube-api-access-988cf\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.665819 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.665840 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.665861 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.665879 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd07d531-e6b9-4b58-9e6c-8012b3a473eb-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.725290 4764 generic.go:334] "Generic (PLEG): container finished" podID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" containerID="87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885" exitCode=0 Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.725357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" 
event={"ID":"fd07d531-e6b9-4b58-9e6c-8012b3a473eb","Type":"ContainerDied","Data":"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885"} Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.725425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" event={"ID":"fd07d531-e6b9-4b58-9e6c-8012b3a473eb","Type":"ContainerDied","Data":"1a7975ca39237d5d9eb2d3836d4599297e7739292f688daa7da83ca9b896c8df"} Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.725467 4764 scope.go:117] "RemoveContainer" containerID="87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.725693 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8mljc" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.776529 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"] Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.782817 4764 scope.go:117] "RemoveContainer" containerID="87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885" Mar 20 14:59:27 crc kubenswrapper[4764]: E0320 14:59:27.783521 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885\": container with ID starting with 87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885 not found: ID does not exist" containerID="87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.783761 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885"} err="failed to get container status 
\"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885\": rpc error: code = NotFound desc = could not find container \"87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885\": container with ID starting with 87feee0a7b11e2328727ea0d639ea430cf006939da0025c9d4c493132a2b0885 not found: ID does not exist" Mar 20 14:59:27 crc kubenswrapper[4764]: I0320 14:59:27.791076 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8mljc"] Mar 20 14:59:29 crc kubenswrapper[4764]: I0320 14:59:29.134184 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" path="/var/lib/kubelet/pods/fd07d531-e6b9-4b58-9e6c-8012b3a473eb/volumes" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.150712 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr"] Mar 20 15:00:00 crc kubenswrapper[4764]: E0320 15:00:00.151874 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" containerName="registry" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.151901 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" containerName="registry" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.152099 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd07d531-e6b9-4b58-9e6c-8012b3a473eb" containerName="registry" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.152838 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.155441 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.159087 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566980-sxrs8"] Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.160726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.162222 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.164089 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.164729 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.165101 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-sxrs8"] Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.165298 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.171039 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr"] Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.256618 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.256823 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlhx\" (UniqueName: \"kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx\") pod \"auto-csr-approver-29566980-sxrs8\" (UID: \"65a080aa-2f66-45e8-adb4-66cc7e2359e3\") " pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.256922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcgr\" (UniqueName: \"kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.256994 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.357935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcgr\" (UniqueName: \"kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.358016 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.358042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.358088 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlhx\" (UniqueName: \"kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx\") pod \"auto-csr-approver-29566980-sxrs8\" (UID: \"65a080aa-2f66-45e8-adb4-66cc7e2359e3\") " pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.359950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.369344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume\") pod 
\"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.377106 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcgr\" (UniqueName: \"kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr\") pod \"collect-profiles-29566980-qm9kr\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.377342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlhx\" (UniqueName: \"kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx\") pod \"auto-csr-approver-29566980-sxrs8\" (UID: \"65a080aa-2f66-45e8-adb4-66cc7e2359e3\") " pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.484635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.491443 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.769255 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-sxrs8"] Mar 20 15:00:00 crc kubenswrapper[4764]: W0320 15:00:00.777996 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a080aa_2f66_45e8_adb4_66cc7e2359e3.slice/crio-96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06 WatchSource:0}: Error finding container 96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06: Status 404 returned error can't find the container with id 96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06 Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.781474 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.867120 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr"] Mar 20 15:00:00 crc kubenswrapper[4764]: W0320 15:00:00.869819 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1638bff_8fe7_45a5_a794_c40aa474724f.slice/crio-9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9 WatchSource:0}: Error finding container 9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9: Status 404 returned error can't find the container with id 9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9 Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.985259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" 
event={"ID":"a1638bff-8fe7-45a5-a794-c40aa474724f","Type":"ContainerStarted","Data":"9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9"} Mar 20 15:00:00 crc kubenswrapper[4764]: I0320 15:00:00.986780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" event={"ID":"65a080aa-2f66-45e8-adb4-66cc7e2359e3","Type":"ContainerStarted","Data":"96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06"} Mar 20 15:00:01 crc kubenswrapper[4764]: I0320 15:00:01.996950 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1638bff-8fe7-45a5-a794-c40aa474724f" containerID="7046506c8f1bb640699d4c3a42e4cc8d33e686c3dcda026b300ab6c66fea7262" exitCode=0 Mar 20 15:00:01 crc kubenswrapper[4764]: I0320 15:00:01.997066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" event={"ID":"a1638bff-8fe7-45a5-a794-c40aa474724f","Type":"ContainerDied","Data":"7046506c8f1bb640699d4c3a42e4cc8d33e686c3dcda026b300ab6c66fea7262"} Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.302614 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.401433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume\") pod \"a1638bff-8fe7-45a5-a794-c40aa474724f\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.401537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfcgr\" (UniqueName: \"kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr\") pod \"a1638bff-8fe7-45a5-a794-c40aa474724f\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.401698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume\") pod \"a1638bff-8fe7-45a5-a794-c40aa474724f\" (UID: \"a1638bff-8fe7-45a5-a794-c40aa474724f\") " Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.402882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a1638bff-8fe7-45a5-a794-c40aa474724f" (UID: "a1638bff-8fe7-45a5-a794-c40aa474724f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.407052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a1638bff-8fe7-45a5-a794-c40aa474724f" (UID: "a1638bff-8fe7-45a5-a794-c40aa474724f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.408329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr" (OuterVolumeSpecName: "kube-api-access-cfcgr") pod "a1638bff-8fe7-45a5-a794-c40aa474724f" (UID: "a1638bff-8fe7-45a5-a794-c40aa474724f"). InnerVolumeSpecName "kube-api-access-cfcgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.503405 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1638bff-8fe7-45a5-a794-c40aa474724f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.503438 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfcgr\" (UniqueName: \"kubernetes.io/projected/a1638bff-8fe7-45a5-a794-c40aa474724f-kube-api-access-cfcgr\") on node \"crc\" DevicePath \"\"" Mar 20 15:00:03 crc kubenswrapper[4764]: I0320 15:00:03.503448 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a1638bff-8fe7-45a5-a794-c40aa474724f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:00:04 crc kubenswrapper[4764]: I0320 15:00:04.012987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" event={"ID":"a1638bff-8fe7-45a5-a794-c40aa474724f","Type":"ContainerDied","Data":"9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9"} Mar 20 15:00:04 crc kubenswrapper[4764]: I0320 15:00:04.013031 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr" Mar 20 15:00:04 crc kubenswrapper[4764]: I0320 15:00:04.013040 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a9bdb2fa378667de138bf8dbaaa4908de28617be0e75560e59250e95f0e9cb9" Mar 20 15:00:08 crc kubenswrapper[4764]: I0320 15:00:08.043436 4764 generic.go:334] "Generic (PLEG): container finished" podID="65a080aa-2f66-45e8-adb4-66cc7e2359e3" containerID="4953a8cd06c5cbd019a63cd17a6ffe752dfa0afe70106c7866d6506464f30880" exitCode=0 Mar 20 15:00:08 crc kubenswrapper[4764]: I0320 15:00:08.043524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" event={"ID":"65a080aa-2f66-45e8-adb4-66cc7e2359e3","Type":"ContainerDied","Data":"4953a8cd06c5cbd019a63cd17a6ffe752dfa0afe70106c7866d6506464f30880"} Mar 20 15:00:09 crc kubenswrapper[4764]: I0320 15:00:09.371363 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:09 crc kubenswrapper[4764]: I0320 15:00:09.484150 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlhx\" (UniqueName: \"kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx\") pod \"65a080aa-2f66-45e8-adb4-66cc7e2359e3\" (UID: \"65a080aa-2f66-45e8-adb4-66cc7e2359e3\") " Mar 20 15:00:09 crc kubenswrapper[4764]: I0320 15:00:09.494374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx" (OuterVolumeSpecName: "kube-api-access-wqlhx") pod "65a080aa-2f66-45e8-adb4-66cc7e2359e3" (UID: "65a080aa-2f66-45e8-adb4-66cc7e2359e3"). InnerVolumeSpecName "kube-api-access-wqlhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:00:09 crc kubenswrapper[4764]: I0320 15:00:09.585987 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlhx\" (UniqueName: \"kubernetes.io/projected/65a080aa-2f66-45e8-adb4-66cc7e2359e3-kube-api-access-wqlhx\") on node \"crc\" DevicePath \"\"" Mar 20 15:00:10 crc kubenswrapper[4764]: I0320 15:00:10.058768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" event={"ID":"65a080aa-2f66-45e8-adb4-66cc7e2359e3","Type":"ContainerDied","Data":"96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06"} Mar 20 15:00:10 crc kubenswrapper[4764]: I0320 15:00:10.058810 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96bf23f6b40ee9c3cc8340f4ace7fcd4ef0be3387ad18cc21af9c3fe3ecf1f06" Mar 20 15:00:10 crc kubenswrapper[4764]: I0320 15:00:10.058854 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-sxrs8" Mar 20 15:00:10 crc kubenswrapper[4764]: I0320 15:00:10.451741 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-5687l"] Mar 20 15:00:10 crc kubenswrapper[4764]: I0320 15:00:10.459249 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-5687l"] Mar 20 15:00:11 crc kubenswrapper[4764]: I0320 15:00:11.152371 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3afda6-923c-403a-994d-996da0ad0fee" path="/var/lib/kubelet/pods/1f3afda6-923c-403a-994d-996da0ad0fee/volumes" Mar 20 15:00:55 crc kubenswrapper[4764]: I0320 15:00:55.749407 4764 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-l759h container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 15:00:55 crc kubenswrapper[4764]: I0320 15:00:55.749854 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-l759h" podUID="0c6ef043-f571-4aff-90e8-a07752e9086c" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 15:01:08 crc kubenswrapper[4764]: I0320 15:01:08.443624 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:01:08 crc kubenswrapper[4764]: I0320 15:01:08.444258 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:01:33 crc kubenswrapper[4764]: I0320 15:01:33.616257 4764 scope.go:117] "RemoveContainer" containerID="c0faee55593f47a0854b211b39c799a6af96eef07aaea0e40978334e0faf40fd" Mar 20 15:01:33 crc kubenswrapper[4764]: I0320 15:01:33.642813 4764 scope.go:117] "RemoveContainer" containerID="b85cad5ad5cd32600a6d190dfb2ce71e7bd6053612f19e30699b8d1c9e344ada" Mar 20 15:01:33 crc kubenswrapper[4764]: I0320 15:01:33.675626 4764 scope.go:117] "RemoveContainer" containerID="75d921f0b333670dd3d0ece1a49bc4b52edebdb3da2335444db898d2f2e8b1d5" Mar 20 15:01:38 crc kubenswrapper[4764]: I0320 15:01:38.444416 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:01:38 crc kubenswrapper[4764]: I0320 15:01:38.444794 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.153094 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566982-z52lz"] Mar 20 15:02:00 crc kubenswrapper[4764]: E0320 15:02:00.154052 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a080aa-2f66-45e8-adb4-66cc7e2359e3" containerName="oc" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.154074 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a080aa-2f66-45e8-adb4-66cc7e2359e3" containerName="oc" Mar 20 15:02:00 crc kubenswrapper[4764]: E0320 15:02:00.154095 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1638bff-8fe7-45a5-a794-c40aa474724f" containerName="collect-profiles" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.154107 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1638bff-8fe7-45a5-a794-c40aa474724f" containerName="collect-profiles" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.154289 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a080aa-2f66-45e8-adb4-66cc7e2359e3" containerName="oc" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.154310 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1638bff-8fe7-45a5-a794-c40aa474724f" containerName="collect-profiles" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 
15:02:00.154895 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.157239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.157620 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.158509 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.176998 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566982-z52lz"] Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.345896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tj7t\" (UniqueName: \"kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t\") pod \"auto-csr-approver-29566982-z52lz\" (UID: \"f93cfbb2-5a0d-4736-88a1-5658c1030a4b\") " pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.447467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tj7t\" (UniqueName: \"kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t\") pod \"auto-csr-approver-29566982-z52lz\" (UID: \"f93cfbb2-5a0d-4736-88a1-5658c1030a4b\") " pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.481099 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tj7t\" (UniqueName: \"kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t\") pod 
\"auto-csr-approver-29566982-z52lz\" (UID: \"f93cfbb2-5a0d-4736-88a1-5658c1030a4b\") " pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.484517 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:00 crc kubenswrapper[4764]: I0320 15:02:00.920843 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566982-z52lz"] Mar 20 15:02:01 crc kubenswrapper[4764]: I0320 15:02:01.881707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566982-z52lz" event={"ID":"f93cfbb2-5a0d-4736-88a1-5658c1030a4b","Type":"ContainerStarted","Data":"ef107588ba1c0dece88569204746738f3b5dcfcf2f4e47c36712ac3fceaeb3a8"} Mar 20 15:02:03 crc kubenswrapper[4764]: I0320 15:02:03.897323 4764 generic.go:334] "Generic (PLEG): container finished" podID="f93cfbb2-5a0d-4736-88a1-5658c1030a4b" containerID="3a9022cc51e1ed5e9a1edc09e9e71e0223714085cfdf0ffea69a9bc48e0dfb0d" exitCode=0 Mar 20 15:02:03 crc kubenswrapper[4764]: I0320 15:02:03.897401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566982-z52lz" event={"ID":"f93cfbb2-5a0d-4736-88a1-5658c1030a4b","Type":"ContainerDied","Data":"3a9022cc51e1ed5e9a1edc09e9e71e0223714085cfdf0ffea69a9bc48e0dfb0d"} Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.197481 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.316188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tj7t\" (UniqueName: \"kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t\") pod \"f93cfbb2-5a0d-4736-88a1-5658c1030a4b\" (UID: \"f93cfbb2-5a0d-4736-88a1-5658c1030a4b\") " Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.326099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t" (OuterVolumeSpecName: "kube-api-access-7tj7t") pod "f93cfbb2-5a0d-4736-88a1-5658c1030a4b" (UID: "f93cfbb2-5a0d-4736-88a1-5658c1030a4b"). InnerVolumeSpecName "kube-api-access-7tj7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.417741 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tj7t\" (UniqueName: \"kubernetes.io/projected/f93cfbb2-5a0d-4736-88a1-5658c1030a4b-kube-api-access-7tj7t\") on node \"crc\" DevicePath \"\"" Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.915998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566982-z52lz" event={"ID":"f93cfbb2-5a0d-4736-88a1-5658c1030a4b","Type":"ContainerDied","Data":"ef107588ba1c0dece88569204746738f3b5dcfcf2f4e47c36712ac3fceaeb3a8"} Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.916054 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef107588ba1c0dece88569204746738f3b5dcfcf2f4e47c36712ac3fceaeb3a8" Mar 20 15:02:05 crc kubenswrapper[4764]: I0320 15:02:05.916425 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566982-z52lz" Mar 20 15:02:06 crc kubenswrapper[4764]: I0320 15:02:06.282638 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-h6xpn"] Mar 20 15:02:06 crc kubenswrapper[4764]: I0320 15:02:06.290120 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-h6xpn"] Mar 20 15:02:07 crc kubenswrapper[4764]: I0320 15:02:07.138933 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="190d359f-af45-40db-a90e-91bc465e6e1f" path="/var/lib/kubelet/pods/190d359f-af45-40db-a90e-91bc465e6e1f/volumes" Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.443743 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.443833 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.443900 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.444723 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.444816 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c" gracePeriod=600 Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.941631 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c" exitCode=0 Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.941731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c"} Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.942074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04"} Mar 20 15:02:08 crc kubenswrapper[4764]: I0320 15:02:08.942127 4764 scope.go:117] "RemoveContainer" containerID="f9d6b1fc48480518320cf9896f1bb5b61939af837c02eef3fcf8ed18bad58336" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.739860 4764 scope.go:117] "RemoveContainer" containerID="ca75eb84c7a50dfc81576a068eb0eaea7a16ab0d5b41cd44978dc3eb392e4d53" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.760648 4764 scope.go:117] "RemoveContainer" containerID="e90443574f4f719172f02718115464a18cce41db48c9e44c1aa44193f1c468a8" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 
15:02:33.794116 4764 scope.go:117] "RemoveContainer" containerID="5a074b1406ab1ac382f7073b289cc8c6fe55630339d2f797bfa5c15530b7afc5" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.809661 4764 scope.go:117] "RemoveContainer" containerID="0fd969413f12cbe838fb83565a38f1f771fdbf84968956688e03cf62ebf643be" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.836932 4764 scope.go:117] "RemoveContainer" containerID="87888cbf290e8781fd7ca3233608cf4f80726b26ac2407ed8728ef03c2f93090" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.852881 4764 scope.go:117] "RemoveContainer" containerID="1b9deeba3ca0056c2b95e510afaddba5c76a8abf7b5acaf54d959db06f192793" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.895843 4764 scope.go:117] "RemoveContainer" containerID="979d0deadee6841075097d4296a42d6e48dd0f01f8c29c7a9373ca540b992b84" Mar 20 15:02:33 crc kubenswrapper[4764]: I0320 15:02:33.927854 4764 scope.go:117] "RemoveContainer" containerID="59150e1e0d63536c0f51e2f90ac24585b123050217841fefc6b6cebc3b3e6c70" Mar 20 15:03:34 crc kubenswrapper[4764]: I0320 15:03:34.025820 4764 scope.go:117] "RemoveContainer" containerID="0d0fbd440c8296db8ad3a9866cb5b928779aa02b85d6f9bff1e43e93ad722114" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.799706 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5"] Mar 20 15:03:49 crc kubenswrapper[4764]: E0320 15:03:49.800193 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93cfbb2-5a0d-4736-88a1-5658c1030a4b" containerName="oc" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.800205 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93cfbb2-5a0d-4736-88a1-5658c1030a4b" containerName="oc" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.800298 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93cfbb2-5a0d-4736-88a1-5658c1030a4b" containerName="oc" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 
15:03:49.800819 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.803283 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.803880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-22hcx" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.804095 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.823703 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-hn9sv"] Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.824611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hn9sv" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.831149 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qshgb" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.837478 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cdjgb"] Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.838237 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.840702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pg8zk" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.844522 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hn9sv"] Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.849139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5"] Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.859048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrb9h\" (UniqueName: \"kubernetes.io/projected/0e9160a5-c44c-4f1c-8a83-2b4944b30542-kube-api-access-hrb9h\") pod \"cert-manager-cainjector-cf98fcc89-ctqb5\" (UID: \"0e9160a5-c44c-4f1c-8a83-2b4944b30542\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.859304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj6zg\" (UniqueName: \"kubernetes.io/projected/e202f2fc-57d5-4c62-838e-0835c8add77c-kube-api-access-fj6zg\") pod \"cert-manager-858654f9db-hn9sv\" (UID: \"e202f2fc-57d5-4c62-838e-0835c8add77c\") " pod="cert-manager/cert-manager-858654f9db-hn9sv" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.864196 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cdjgb"] Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.960095 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrb9h\" (UniqueName: \"kubernetes.io/projected/0e9160a5-c44c-4f1c-8a83-2b4944b30542-kube-api-access-hrb9h\") pod \"cert-manager-cainjector-cf98fcc89-ctqb5\" (UID: 
\"0e9160a5-c44c-4f1c-8a83-2b4944b30542\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.960167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hplm\" (UniqueName: \"kubernetes.io/projected/a78d7fb5-1e8d-45e3-8af0-105378a7c9ae-kube-api-access-8hplm\") pod \"cert-manager-webhook-687f57d79b-cdjgb\" (UID: \"a78d7fb5-1e8d-45e3-8af0-105378a7c9ae\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.960200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj6zg\" (UniqueName: \"kubernetes.io/projected/e202f2fc-57d5-4c62-838e-0835c8add77c-kube-api-access-fj6zg\") pod \"cert-manager-858654f9db-hn9sv\" (UID: \"e202f2fc-57d5-4c62-838e-0835c8add77c\") " pod="cert-manager/cert-manager-858654f9db-hn9sv" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.978007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrb9h\" (UniqueName: \"kubernetes.io/projected/0e9160a5-c44c-4f1c-8a83-2b4944b30542-kube-api-access-hrb9h\") pod \"cert-manager-cainjector-cf98fcc89-ctqb5\" (UID: \"0e9160a5-c44c-4f1c-8a83-2b4944b30542\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" Mar 20 15:03:49 crc kubenswrapper[4764]: I0320 15:03:49.989125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj6zg\" (UniqueName: \"kubernetes.io/projected/e202f2fc-57d5-4c62-838e-0835c8add77c-kube-api-access-fj6zg\") pod \"cert-manager-858654f9db-hn9sv\" (UID: \"e202f2fc-57d5-4c62-838e-0835c8add77c\") " pod="cert-manager/cert-manager-858654f9db-hn9sv" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.062092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hplm\" (UniqueName: 
\"kubernetes.io/projected/a78d7fb5-1e8d-45e3-8af0-105378a7c9ae-kube-api-access-8hplm\") pod \"cert-manager-webhook-687f57d79b-cdjgb\" (UID: \"a78d7fb5-1e8d-45e3-8af0-105378a7c9ae\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.089658 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hplm\" (UniqueName: \"kubernetes.io/projected/a78d7fb5-1e8d-45e3-8af0-105378a7c9ae-kube-api-access-8hplm\") pod \"cert-manager-webhook-687f57d79b-cdjgb\" (UID: \"a78d7fb5-1e8d-45e3-8af0-105378a7c9ae\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.119427 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.137479 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hn9sv" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.160009 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.380362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hn9sv"] Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.428202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cdjgb"] Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.661579 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5"] Mar 20 15:03:50 crc kubenswrapper[4764]: W0320 15:03:50.663865 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9160a5_c44c_4f1c_8a83_2b4944b30542.slice/crio-ad448b6f68a031cac8cc3b74399697cf18cd9800fbe9812f77f65bf1c40ab566 WatchSource:0}: Error finding container ad448b6f68a031cac8cc3b74399697cf18cd9800fbe9812f77f65bf1c40ab566: Status 404 returned error can't find the container with id ad448b6f68a031cac8cc3b74399697cf18cd9800fbe9812f77f65bf1c40ab566 Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.674597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" event={"ID":"0e9160a5-c44c-4f1c-8a83-2b4944b30542","Type":"ContainerStarted","Data":"ad448b6f68a031cac8cc3b74399697cf18cd9800fbe9812f77f65bf1c40ab566"} Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.675996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" event={"ID":"a78d7fb5-1e8d-45e3-8af0-105378a7c9ae","Type":"ContainerStarted","Data":"c557c6a8bf613f550b468c40fb219b9b35513829d7cc4aafbf1c2a03a1376636"} Mar 20 15:03:50 crc kubenswrapper[4764]: I0320 15:03:50.677931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hn9sv" 
event={"ID":"e202f2fc-57d5-4c62-838e-0835c8add77c","Type":"ContainerStarted","Data":"1196ea35dd5269f3e99bcc11db928f94175746f1c3c0b5b0b41acf0d9b8a6940"} Mar 20 15:03:55 crc kubenswrapper[4764]: I0320 15:03:55.729896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" event={"ID":"a78d7fb5-1e8d-45e3-8af0-105378a7c9ae","Type":"ContainerStarted","Data":"72e74fc850f9c21b93df8b5a7b7633320af6993dea04eb5ebb9ee767eac2c04f"} Mar 20 15:03:55 crc kubenswrapper[4764]: I0320 15:03:55.730392 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:03:55 crc kubenswrapper[4764]: I0320 15:03:55.754176 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" podStartSLOduration=3.089647712 podStartE2EDuration="6.754156963s" podCreationTimestamp="2026-03-20 15:03:49 +0000 UTC" firstStartedPulling="2026-03-20 15:03:50.437958059 +0000 UTC m=+752.054147188" lastFinishedPulling="2026-03-20 15:03:54.10246731 +0000 UTC m=+755.718656439" observedRunningTime="2026-03-20 15:03:55.751004438 +0000 UTC m=+757.367193577" watchObservedRunningTime="2026-03-20 15:03:55.754156963 +0000 UTC m=+757.370346102" Mar 20 15:03:56 crc kubenswrapper[4764]: I0320 15:03:56.739568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" event={"ID":"0e9160a5-c44c-4f1c-8a83-2b4944b30542","Type":"ContainerStarted","Data":"afb62a1492509a32d37b397fc7e785a1ea053f433451d7dcb02dc5516db1ebfa"} Mar 20 15:03:56 crc kubenswrapper[4764]: I0320 15:03:56.742466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hn9sv" event={"ID":"e202f2fc-57d5-4c62-838e-0835c8add77c","Type":"ContainerStarted","Data":"840e27cad9419a1e4cd8dc63e62a23b67b3b5f2af9ed668c94f0c049769946ef"} Mar 20 15:03:56 crc kubenswrapper[4764]: I0320 
15:03:56.766004 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-ctqb5" podStartSLOduration=2.480561388 podStartE2EDuration="7.765982137s" podCreationTimestamp="2026-03-20 15:03:49 +0000 UTC" firstStartedPulling="2026-03-20 15:03:50.668000899 +0000 UTC m=+752.284190028" lastFinishedPulling="2026-03-20 15:03:55.953421648 +0000 UTC m=+757.569610777" observedRunningTime="2026-03-20 15:03:56.762776961 +0000 UTC m=+758.378966130" watchObservedRunningTime="2026-03-20 15:03:56.765982137 +0000 UTC m=+758.382171296" Mar 20 15:03:56 crc kubenswrapper[4764]: I0320 15:03:56.831726 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-hn9sv" podStartSLOduration=2.522753651 podStartE2EDuration="7.831704023s" podCreationTimestamp="2026-03-20 15:03:49 +0000 UTC" firstStartedPulling="2026-03-20 15:03:50.388732273 +0000 UTC m=+752.004921402" lastFinishedPulling="2026-03-20 15:03:55.697682635 +0000 UTC m=+757.313871774" observedRunningTime="2026-03-20 15:03:56.828402903 +0000 UTC m=+758.444592042" watchObservedRunningTime="2026-03-20 15:03:56.831704023 +0000 UTC m=+758.447893152" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.104944 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p5lds"] Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105610 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-controller" containerID="cri-o://90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105717 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" 
containerName="northd" containerID="cri-o://51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105682 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="nbdb" containerID="cri-o://bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105772 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-acl-logging" containerID="cri-o://a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105766 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="sbdb" containerID="cri-o://1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105754 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.105764 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-node" containerID="cri-o://84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" gracePeriod=30 Mar 20 15:04:00 crc 
kubenswrapper[4764]: I0320 15:04:00.149558 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566984-rtmj8"] Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.151111 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.154816 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.154831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.155226 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.158955 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566984-rtmj8"] Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.163651 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cdjgb" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.164615 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" containerID="cri-o://46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" gracePeriod=30 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.198337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hg9\" (UniqueName: \"kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9\") pod \"auto-csr-approver-29566984-rtmj8\" (UID: \"c0526618-5c59-4e4b-a854-cd1d61c50c53\") " 
pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.299879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hg9\" (UniqueName: \"kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9\") pod \"auto-csr-approver-29566984-rtmj8\" (UID: \"c0526618-5c59-4e4b-a854-cd1d61c50c53\") " pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.316320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hg9\" (UniqueName: \"kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9\") pod \"auto-csr-approver-29566984-rtmj8\" (UID: \"c0526618-5c59-4e4b-a854-cd1d61c50c53\") " pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.472594 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/4.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.479813 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/3.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.484221 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovn-acl-logging/0.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.484791 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovn-controller/0.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.485302 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.489557 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502425 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash" (OuterVolumeSpecName: "host-slash") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502515 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502547 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502651 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket" (OuterVolumeSpecName: "log-socket") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502659 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502664 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502725 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502730 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: 
"f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502749 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s727g\" (UniqueName: \"kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502793 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502808 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502957 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.502980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units\") pod \"f2a6c163-0457-4626-9bbb-5628a5155673\" (UID: \"f2a6c163-0457-4626-9bbb-5628a5155673\") " Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503051 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503211 4764 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503227 4764 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503239 4764 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503249 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503259 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503269 4764 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503279 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503288 4764 
reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503298 4764 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503308 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503318 4764 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503329 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503469 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log" (OuterVolumeSpecName: "node-log") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.503551 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.506159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.507416 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g" (OuterVolumeSpecName: "kube-api-access-s727g") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "kube-api-access-s727g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.518643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f2a6c163-0457-4626-9bbb-5628a5155673" (UID: "f2a6c163-0457-4626-9bbb-5628a5155673"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.522032 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(c241f7878a203e1a201453f3e4dd6aeccadfec2928cbd92a9c450212b2c2f527): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.522101 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(c241f7878a203e1a201453f3e4dd6aeccadfec2928cbd92a9c450212b2c2f527): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.522125 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(c241f7878a203e1a201453f3e4dd6aeccadfec2928cbd92a9c450212b2c2f527): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.522183 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(c241f7878a203e1a201453f3e4dd6aeccadfec2928cbd92a9c450212b2c2f527): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535430 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-h7ss4"] Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535634 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="sbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535645 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="sbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535657 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535663 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535670 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535676 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535683 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535688 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535694 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-acl-logging" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-acl-logging" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535705 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535711 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535720 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535725 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535741 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-node" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535747 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-node" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535755 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kubecfg-setup" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535760 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kubecfg-setup" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535766 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="nbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535772 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="nbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.535778 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="northd" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535784 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="northd" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535881 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-acl-logging" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535890 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535895 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovn-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535903 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="northd" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535911 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="sbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535919 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535926 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-node" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535932 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535939 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.535998 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="nbdb" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.536093 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.536100 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.536109 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.536114 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.536193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" containerName="ovnkube-controller" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.538118 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-var-lib-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-systemd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-kubelet\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-netd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604169 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-etc-openvswitch\") pod 
\"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b35270ee-14db-4f71-9a74-7246e3c3f465-ovn-node-metrics-cert\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z795s\" (UniqueName: \"kubernetes.io/projected/b35270ee-14db-4f71-9a74-7246e3c3f465-kube-api-access-z795s\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-script-lib\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-env-overrides\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604294 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-netns\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-node-log\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-log-socket\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-slash\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604569 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-config\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-systemd-units\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-ovn\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-bin\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604836 4764 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604847 4764 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604861 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f2a6c163-0457-4626-9bbb-5628a5155673-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604870 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s727g\" (UniqueName: \"kubernetes.io/projected/f2a6c163-0457-4626-9bbb-5628a5155673-kube-api-access-s727g\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604881 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604889 4764 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604897 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/f2a6c163-0457-4626-9bbb-5628a5155673-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.604909 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f2a6c163-0457-4626-9bbb-5628a5155673-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-systemd-units\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-ovn\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-bin\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706653 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-var-lib-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-ovn\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-systemd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-var-lib-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-kubelet\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706799 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-netd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-etc-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706438 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-systemd-units\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-kubelet\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-bin\") pod 
\"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-cni-netd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706762 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-systemd\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.706952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-etc-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z795s\" (UniqueName: \"kubernetes.io/projected/b35270ee-14db-4f71-9a74-7246e3c3f465-kube-api-access-z795s\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b35270ee-14db-4f71-9a74-7246e3c3f465-ovn-node-metrics-cert\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-script-lib\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-env-overrides\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707938 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-netns\") pod \"ovnkube-node-h7ss4\" (UID: 
\"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-env-overrides\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-script-lib\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.707981 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-node-log\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-run-netns\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-node-log\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc 
kubenswrapper[4764]: I0320 15:04:00.708064 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-log-socket\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-slash\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708133 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-config\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-log-socket\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708172 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-run-openvswitch\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708209 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b35270ee-14db-4f71-9a74-7246e3c3f465-host-slash\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.708682 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b35270ee-14db-4f71-9a74-7246e3c3f465-ovnkube-config\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.710765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b35270ee-14db-4f71-9a74-7246e3c3f465-ovn-node-metrics-cert\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.722886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z795s\" (UniqueName: \"kubernetes.io/projected/b35270ee-14db-4f71-9a74-7246e3c3f465-kube-api-access-z795s\") pod \"ovnkube-node-h7ss4\" (UID: \"b35270ee-14db-4f71-9a74-7246e3c3f465\") " pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.770114 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/2.log" Mar 20 15:04:00 
crc kubenswrapper[4764]: I0320 15:04:00.770844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/1.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.770884 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f85a77d-475e-43c9-8181-093451bc058f" containerID="3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136" exitCode=2 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.770931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerDied","Data":"3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.771197 4764 scope.go:117] "RemoveContainer" containerID="7a45deaf09c399dfadd97c92dbcb7cb9d94d21f2d39069e2f4b7de45135f4abc" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.771893 4764 scope.go:117] "RemoveContainer" containerID="3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.772301 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d4m5r_openshift-multus(1f85a77d-475e-43c9-8181-093451bc058f)\"" pod="openshift-multus/multus-d4m5r" podUID="1f85a77d-475e-43c9-8181-093451bc058f" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.773633 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/4.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.774486 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovnkube-controller/3.log" Mar 20 
15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.785129 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovn-acl-logging/0.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.785882 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p5lds_f2a6c163-0457-4626-9bbb-5628a5155673/ovn-controller/0.log" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786855 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" exitCode=2 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786900 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" exitCode=0 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786916 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" exitCode=0 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786932 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" exitCode=0 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" 
event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787015 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.786950 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" exitCode=0 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787308 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" exitCode=0 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787327 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" 
containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" exitCode=143 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787351 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2a6c163-0457-4626-9bbb-5628a5155673" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" exitCode=143 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787443 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787460 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787474 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787485 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787495 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787506 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787516 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787526 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787536 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787546 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787590 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787604 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 
15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787615 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787626 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787637 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787648 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787659 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787669 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787680 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787690 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} Mar 20 
15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787723 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787735 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787746 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787757 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787768 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787779 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787789 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787799 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787809 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787821 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p5lds" event={"ID":"f2a6c163-0457-4626-9bbb-5628a5155673","Type":"ContainerDied","Data":"429a65d617a759fd722f43c9c099401deaf966bf0ca3a152452cfd1c3a335920"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787864 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787876 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787887 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787897 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787908 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787919 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787929 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787944 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787962 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787986 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.787821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.794970 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.827765 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(fd170097e74343470cb6c7dd45c28f03276118148d814b1ad96f602ca1cfdfa7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.827908 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(fd170097e74343470cb6c7dd45c28f03276118148d814b1ad96f602ca1cfdfa7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.828003 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(fd170097e74343470cb6c7dd45c28f03276118148d814b1ad96f602ca1cfdfa7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29566984-rtmj8" Mar 20 15:04:00 crc kubenswrapper[4764]: E0320 15:04:00.828108 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(fd170097e74343470cb6c7dd45c28f03276118148d814b1ad96f602ca1cfdfa7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.849032 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p5lds"] Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.851637 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.857317 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p5lds"] Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.877814 4764 scope.go:117] "RemoveContainer" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:00 crc kubenswrapper[4764]: W0320 15:04:00.887682 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb35270ee_14db_4f71_9a74_7246e3c3f465.slice/crio-a90eec534ff45b5e9d1f75a0a3bf2ab62aa5b40d4a5ac03235d8736a9816c262 WatchSource:0}: Error finding container a90eec534ff45b5e9d1f75a0a3bf2ab62aa5b40d4a5ac03235d8736a9816c262: Status 404 returned error can't find the container with id a90eec534ff45b5e9d1f75a0a3bf2ab62aa5b40d4a5ac03235d8736a9816c262 Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.898538 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.922616 4764 scope.go:117] "RemoveContainer" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.942030 4764 scope.go:117] "RemoveContainer" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.961442 4764 scope.go:117] "RemoveContainer" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:00 crc kubenswrapper[4764]: I0320 15:04:00.982858 4764 scope.go:117] "RemoveContainer" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.005494 4764 scope.go:117] "RemoveContainer" 
containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.020219 4764 scope.go:117] "RemoveContainer" containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.033616 4764 scope.go:117] "RemoveContainer" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.048073 4764 scope.go:117] "RemoveContainer" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.065537 4764 scope.go:117] "RemoveContainer" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.069202 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": container with ID starting with 46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e not found: ID does not exist" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.069258 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} err="failed to get container status \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": rpc error: code = NotFound desc = could not find container \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": container with ID starting with 46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.069280 4764 scope.go:117] "RemoveContainer" 
containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.070190 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": container with ID starting with 4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43 not found: ID does not exist" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.070213 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} err="failed to get container status \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": rpc error: code = NotFound desc = could not find container \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": container with ID starting with 4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.070231 4764 scope.go:117] "RemoveContainer" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.070506 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": container with ID starting with 1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765 not found: ID does not exist" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.070563 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} err="failed to get container status \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": rpc error: code = NotFound desc = could not find container \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": container with ID starting with 1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.070578 4764 scope.go:117] "RemoveContainer" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.071148 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": container with ID starting with bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6 not found: ID does not exist" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.071186 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} err="failed to get container status \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": rpc error: code = NotFound desc = could not find container \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": container with ID starting with bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.071204 4764 scope.go:117] "RemoveContainer" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.071750 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": container with ID starting with 51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84 not found: ID does not exist" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.071788 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} err="failed to get container status \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": rpc error: code = NotFound desc = could not find container \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": container with ID starting with 51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.071802 4764 scope.go:117] "RemoveContainer" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.072103 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": container with ID starting with bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73 not found: ID does not exist" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072142 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} err="failed to get container status \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": rpc error: code = NotFound desc = could not find container 
\"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": container with ID starting with bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072155 4764 scope.go:117] "RemoveContainer" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.072405 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": container with ID starting with 84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4 not found: ID does not exist" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072442 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} err="failed to get container status \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": rpc error: code = NotFound desc = could not find container \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": container with ID starting with 84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072492 4764 scope.go:117] "RemoveContainer" containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.072738 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": container with ID starting with a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500 not found: ID does not exist" 
containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072812 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} err="failed to get container status \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": rpc error: code = NotFound desc = could not find container \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": container with ID starting with a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.072829 4764 scope.go:117] "RemoveContainer" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.073177 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": container with ID starting with 90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10 not found: ID does not exist" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073194 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} err="failed to get container status \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": rpc error: code = NotFound desc = could not find container \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": container with ID starting with 90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073227 4764 scope.go:117] 
"RemoveContainer" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: E0320 15:04:01.073495 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": container with ID starting with 5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016 not found: ID does not exist" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073511 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} err="failed to get container status \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": rpc error: code = NotFound desc = could not find container \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": container with ID starting with 5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073560 4764 scope.go:117] "RemoveContainer" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073764 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} err="failed to get container status \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": rpc error: code = NotFound desc = could not find container \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": container with ID starting with 46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073784 4764 
scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073939 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} err="failed to get container status \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": rpc error: code = NotFound desc = could not find container \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": container with ID starting with 4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.073958 4764 scope.go:117] "RemoveContainer" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074342 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} err="failed to get container status \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": rpc error: code = NotFound desc = could not find container \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": container with ID starting with 1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074364 4764 scope.go:117] "RemoveContainer" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074593 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} err="failed to get container status \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": rpc 
error: code = NotFound desc = could not find container \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": container with ID starting with bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074612 4764 scope.go:117] "RemoveContainer" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074875 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} err="failed to get container status \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": rpc error: code = NotFound desc = could not find container \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": container with ID starting with 51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.074889 4764 scope.go:117] "RemoveContainer" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.075282 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} err="failed to get container status \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": rpc error: code = NotFound desc = could not find container \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": container with ID starting with bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.075297 4764 scope.go:117] "RemoveContainer" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc 
kubenswrapper[4764]: I0320 15:04:01.075564 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} err="failed to get container status \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": rpc error: code = NotFound desc = could not find container \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": container with ID starting with 84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.075601 4764 scope.go:117] "RemoveContainer" containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.075824 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} err="failed to get container status \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": rpc error: code = NotFound desc = could not find container \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": container with ID starting with a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.075842 4764 scope.go:117] "RemoveContainer" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076005 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} err="failed to get container status \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": rpc error: code = NotFound desc = could not find container \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": container 
with ID starting with 90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076023 4764 scope.go:117] "RemoveContainer" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076195 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} err="failed to get container status \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": rpc error: code = NotFound desc = could not find container \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": container with ID starting with 5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076218 4764 scope.go:117] "RemoveContainer" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076369 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} err="failed to get container status \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": rpc error: code = NotFound desc = could not find container \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": container with ID starting with 46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076398 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076635 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} err="failed to get container status \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": rpc error: code = NotFound desc = could not find container \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": container with ID starting with 4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076657 4764 scope.go:117] "RemoveContainer" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076908 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} err="failed to get container status \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": rpc error: code = NotFound desc = could not find container \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": container with ID starting with 1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.076942 4764 scope.go:117] "RemoveContainer" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077197 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} err="failed to get container status \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": rpc error: code = NotFound desc = could not find container \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": container with ID starting with bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6 not found: ID does not 
exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077213 4764 scope.go:117] "RemoveContainer" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077444 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} err="failed to get container status \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": rpc error: code = NotFound desc = could not find container \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": container with ID starting with 51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077458 4764 scope.go:117] "RemoveContainer" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077668 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} err="failed to get container status \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": rpc error: code = NotFound desc = could not find container \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": container with ID starting with bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077690 4764 scope.go:117] "RemoveContainer" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077874 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} err="failed to get container status 
\"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": rpc error: code = NotFound desc = could not find container \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": container with ID starting with 84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.077897 4764 scope.go:117] "RemoveContainer" containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078061 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} err="failed to get container status \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": rpc error: code = NotFound desc = could not find container \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": container with ID starting with a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078098 4764 scope.go:117] "RemoveContainer" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078299 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} err="failed to get container status \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": rpc error: code = NotFound desc = could not find container \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": container with ID starting with 90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078320 4764 scope.go:117] "RemoveContainer" 
containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078524 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} err="failed to get container status \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": rpc error: code = NotFound desc = could not find container \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": container with ID starting with 5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078541 4764 scope.go:117] "RemoveContainer" containerID="46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078784 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e"} err="failed to get container status \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": rpc error: code = NotFound desc = could not find container \"46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e\": container with ID starting with 46e7c8c8255c116dc50c0d5e398685b7ab6eecf1d63c6a6d6372915f8adc564e not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.078847 4764 scope.go:117] "RemoveContainer" containerID="4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.079729 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43"} err="failed to get container status \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": rpc error: code = NotFound desc = could 
not find container \"4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43\": container with ID starting with 4c004a60ab5a3c96ec6131e707a4d958ea568097848e861f2a25bdc0057d1d43 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.079748 4764 scope.go:117] "RemoveContainer" containerID="1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.080127 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765"} err="failed to get container status \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": rpc error: code = NotFound desc = could not find container \"1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765\": container with ID starting with 1cee2f98c0dc44542fcc3aa7011a39517ba24796baa61bbc9e5b2aaaee5fb765 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.080142 4764 scope.go:117] "RemoveContainer" containerID="bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.080494 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6"} err="failed to get container status \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": rpc error: code = NotFound desc = could not find container \"bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6\": container with ID starting with bd9453652f5f929ed56f4ac68c63269211cd3256b61ce0c991256794250319d6 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.080513 4764 scope.go:117] "RemoveContainer" containerID="51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 
15:04:01.080972 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84"} err="failed to get container status \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": rpc error: code = NotFound desc = could not find container \"51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84\": container with ID starting with 51c03343d99a7f28ae25b3978aababea30657ee6e3fea25a4da9a0b854479f84 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.080992 4764 scope.go:117] "RemoveContainer" containerID="bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.081544 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73"} err="failed to get container status \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": rpc error: code = NotFound desc = could not find container \"bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73\": container with ID starting with bc41d632789d76937ff7e58b3e4056a244292e05d86a0c1bf452175f91a56b73 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.081560 4764 scope.go:117] "RemoveContainer" containerID="84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.081877 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4"} err="failed to get container status \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": rpc error: code = NotFound desc = could not find container \"84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4\": container with ID starting with 
84bcb890a0e94889e28a7334e39228654c8b01005bab5eb089dad87482ac64d4 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.081920 4764 scope.go:117] "RemoveContainer" containerID="a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.082476 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500"} err="failed to get container status \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": rpc error: code = NotFound desc = could not find container \"a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500\": container with ID starting with a0946781bdbb2c1d0990aa87342a0a876f8c7bddc519f906e5545dbea659f500 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.082492 4764 scope.go:117] "RemoveContainer" containerID="90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.082761 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10"} err="failed to get container status \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": rpc error: code = NotFound desc = could not find container \"90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10\": container with ID starting with 90a1184983d80d51653b5e5280650f8c8986c7dd46b76250c38964b8ade63e10 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.082778 4764 scope.go:117] "RemoveContainer" containerID="5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.083126 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016"} err="failed to get container status \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": rpc error: code = NotFound desc = could not find container \"5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016\": container with ID starting with 5e49656a1516fe789838ccc88fa4a48867e52c972c3ffd2b943a64861642e016 not found: ID does not exist" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.133447 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a6c163-0457-4626-9bbb-5628a5155673" path="/var/lib/kubelet/pods/f2a6c163-0457-4626-9bbb-5628a5155673/volumes" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.795964 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/2.log" Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.797850 4764 generic.go:334] "Generic (PLEG): container finished" podID="b35270ee-14db-4f71-9a74-7246e3c3f465" containerID="5d1461312f732848f03c02ae747de30cf8644693bd4b60a8409f57a6f7841c08" exitCode=0 Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.797899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerDied","Data":"5d1461312f732848f03c02ae747de30cf8644693bd4b60a8409f57a6f7841c08"} Mar 20 15:04:01 crc kubenswrapper[4764]: I0320 15:04:01.797959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"a90eec534ff45b5e9d1f75a0a3bf2ab62aa5b40d4a5ac03235d8736a9816c262"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.810530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" 
event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"c8bdee127660689482d004adc5bb905f551612efa5b2db6e3336169e25733174"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.811107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"652d0f08d3d2d5dd505e5da10ebf945525fee71d1e26bfa89f632d0a1e709e85"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.811125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"689bd26b457e71ecfb455eaf6a17c07b692c588624ee356205405b5cae0189b1"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.811137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"d0199f4f0d80a50244a60581bbfeef2bd19140a805899f99f7b451b9c5efff45"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.811149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"48d143749831429ef691c83d86639cd76fe035305cb59af2e5530193fb955c00"} Mar 20 15:04:02 crc kubenswrapper[4764]: I0320 15:04:02.811159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"75f171c454fa25a6caf75fb0e1b0e9f6a153d83d3bbaeba4c1c7cefc6e829251"} Mar 20 15:04:04 crc kubenswrapper[4764]: I0320 15:04:04.829611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" 
event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"b272db713fbafc721856218b030fb8995f9b3254625fa9452b6b2bb76f964520"} Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.880983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" event={"ID":"b35270ee-14db-4f71-9a74-7246e3c3f465","Type":"ContainerStarted","Data":"0a4624b5ed6f145ece692c2be4425a77eea7be161b65b33f25b78c1ceef8cc5f"} Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.881665 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.881714 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.881729 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.914134 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.926209 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" Mar 20 15:04:07 crc kubenswrapper[4764]: I0320 15:04:07.931765 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4" podStartSLOduration=7.931730468 podStartE2EDuration="7.931730468s" podCreationTimestamp="2026-03-20 15:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:04:07.922751039 +0000 UTC m=+769.538940208" watchObservedRunningTime="2026-03-20 15:04:07.931730468 +0000 UTC m=+769.547919647" Mar 20 15:04:08 crc 
kubenswrapper[4764]: I0320 15:04:08.444650 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:04:08 crc kubenswrapper[4764]: I0320 15:04:08.445145 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:04:12 crc kubenswrapper[4764]: I0320 15:04:12.127280 4764 scope.go:117] "RemoveContainer" containerID="3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136"
Mar 20 15:04:12 crc kubenswrapper[4764]: E0320 15:04:12.127940 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d4m5r_openshift-multus(1f85a77d-475e-43c9-8181-093451bc058f)\"" pod="openshift-multus/multus-d4m5r" podUID="1f85a77d-475e-43c9-8181-093451bc058f"
Mar 20 15:04:14 crc kubenswrapper[4764]: I0320 15:04:14.125723 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:14 crc kubenswrapper[4764]: I0320 15:04:14.127024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:14 crc kubenswrapper[4764]: E0320 15:04:14.169566 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(f23d0316979960cb9b718902dc6ca91de24c7d923d715788bd039f324f92ac02): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 15:04:14 crc kubenswrapper[4764]: E0320 15:04:14.170067 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(f23d0316979960cb9b718902dc6ca91de24c7d923d715788bd039f324f92ac02): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:14 crc kubenswrapper[4764]: E0320 15:04:14.170116 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(f23d0316979960cb9b718902dc6ca91de24c7d923d715788bd039f324f92ac02): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:14 crc kubenswrapper[4764]: E0320 15:04:14.170209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566984-rtmj8_openshift-infra(c0526618-5c59-4e4b-a854-cd1d61c50c53)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566984-rtmj8_openshift-infra_c0526618-5c59-4e4b-a854-cd1d61c50c53_0(f23d0316979960cb9b718902dc6ca91de24c7d923d715788bd039f324f92ac02): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53"
Mar 20 15:04:27 crc kubenswrapper[4764]: I0320 15:04:27.126079 4764 scope.go:117] "RemoveContainer" containerID="3672cc3a563a8bf393194d9c28a5c0bf757103d69c941de1407add1cb9efe136"
Mar 20 15:04:28 crc kubenswrapper[4764]: I0320 15:04:28.018555 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4m5r_1f85a77d-475e-43c9-8181-093451bc058f/kube-multus/2.log"
Mar 20 15:04:28 crc kubenswrapper[4764]: I0320 15:04:28.019124 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4m5r" event={"ID":"1f85a77d-475e-43c9-8181-093451bc058f","Type":"ContainerStarted","Data":"a828c84a39f0857a8e71cfbc51068f23348c180d139413ddcbb36a58bc9dd0a0"}
Mar 20 15:04:29 crc kubenswrapper[4764]: I0320 15:04:29.131751 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:29 crc kubenswrapper[4764]: I0320 15:04:29.134469 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:29 crc kubenswrapper[4764]: I0320 15:04:29.354647 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566984-rtmj8"]
Mar 20 15:04:30 crc kubenswrapper[4764]: I0320 15:04:30.031133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" event={"ID":"c0526618-5c59-4e4b-a854-cd1d61c50c53","Type":"ContainerStarted","Data":"41d4fd94b695d8b5cd060c5c6b80319ca2db0e222a23445e834268f3978cb312"}
Mar 20 15:04:30 crc kubenswrapper[4764]: I0320 15:04:30.883943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h7ss4"
Mar 20 15:04:31 crc kubenswrapper[4764]: I0320 15:04:31.039419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" event={"ID":"c0526618-5c59-4e4b-a854-cd1d61c50c53","Type":"ContainerStarted","Data":"7c79250be52e18ff86bd91d4d30432738d4258e40ff75bbeb4bceb81b5793d72"}
Mar 20 15:04:31 crc kubenswrapper[4764]: I0320 15:04:31.053029 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" podStartSLOduration=29.718413857 podStartE2EDuration="31.053006852s" podCreationTimestamp="2026-03-20 15:04:00 +0000 UTC" firstStartedPulling="2026-03-20 15:04:29.368189611 +0000 UTC m=+790.984378760" lastFinishedPulling="2026-03-20 15:04:30.702782586 +0000 UTC m=+792.318971755" observedRunningTime="2026-03-20 15:04:31.051062652 +0000 UTC m=+792.667251781" watchObservedRunningTime="2026-03-20 15:04:31.053006852 +0000 UTC m=+792.669195991"
Mar 20 15:04:32 crc kubenswrapper[4764]: I0320 15:04:32.046748 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0526618-5c59-4e4b-a854-cd1d61c50c53" containerID="7c79250be52e18ff86bd91d4d30432738d4258e40ff75bbeb4bceb81b5793d72" exitCode=0
Mar 20 15:04:32 crc kubenswrapper[4764]: I0320 15:04:32.046836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" event={"ID":"c0526618-5c59-4e4b-a854-cd1d61c50c53","Type":"ContainerDied","Data":"7c79250be52e18ff86bd91d4d30432738d4258e40ff75bbeb4bceb81b5793d72"}
Mar 20 15:04:33 crc kubenswrapper[4764]: I0320 15:04:33.783212 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:33 crc kubenswrapper[4764]: I0320 15:04:33.798328 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5hg9\" (UniqueName: \"kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9\") pod \"c0526618-5c59-4e4b-a854-cd1d61c50c53\" (UID: \"c0526618-5c59-4e4b-a854-cd1d61c50c53\") "
Mar 20 15:04:33 crc kubenswrapper[4764]: I0320 15:04:33.809057 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9" (OuterVolumeSpecName: "kube-api-access-k5hg9") pod "c0526618-5c59-4e4b-a854-cd1d61c50c53" (UID: "c0526618-5c59-4e4b-a854-cd1d61c50c53"). InnerVolumeSpecName "kube-api-access-k5hg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:04:33 crc kubenswrapper[4764]: I0320 15:04:33.899951 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5hg9\" (UniqueName: \"kubernetes.io/projected/c0526618-5c59-4e4b-a854-cd1d61c50c53-kube-api-access-k5hg9\") on node \"crc\" DevicePath \"\""
Mar 20 15:04:34 crc kubenswrapper[4764]: I0320 15:04:34.065250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566984-rtmj8" event={"ID":"c0526618-5c59-4e4b-a854-cd1d61c50c53","Type":"ContainerDied","Data":"41d4fd94b695d8b5cd060c5c6b80319ca2db0e222a23445e834268f3978cb312"}
Mar 20 15:04:34 crc kubenswrapper[4764]: I0320 15:04:34.065313 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41d4fd94b695d8b5cd060c5c6b80319ca2db0e222a23445e834268f3978cb312"
Mar 20 15:04:34 crc kubenswrapper[4764]: I0320 15:04:34.065471 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566984-rtmj8"
Mar 20 15:04:34 crc kubenswrapper[4764]: I0320 15:04:34.119666 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-kl2wd"]
Mar 20 15:04:34 crc kubenswrapper[4764]: I0320 15:04:34.127479 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-kl2wd"]
Mar 20 15:04:35 crc kubenswrapper[4764]: I0320 15:04:35.139047 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc39d516-80da-4091-842b-2bcef48bcc57" path="/var/lib/kubelet/pods/bc39d516-80da-4091-842b-2bcef48bcc57/volumes"
Mar 20 15:04:38 crc kubenswrapper[4764]: I0320 15:04:38.444017 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:04:38 crc kubenswrapper[4764]: I0320 15:04:38.444279 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.246678 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"]
Mar 20 15:04:39 crc kubenswrapper[4764]: E0320 15:04:39.246922 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" containerName="oc"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.246936 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" containerName="oc"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.247042 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" containerName="oc"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.247874 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.250725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.266492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"]
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.371872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.371936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.372121 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frrs5\" (UniqueName: \"kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.474303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.474371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.474487 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frrs5\" (UniqueName: \"kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.475130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.475161 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.505216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frrs5\" (UniqueName: \"kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.563278 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:39 crc kubenswrapper[4764]: I0320 15:04:39.838067 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"]
Mar 20 15:04:39 crc kubenswrapper[4764]: W0320 15:04:39.846360 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66364f10_f71c_48d8_9691_6b840a589609.slice/crio-635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7 WatchSource:0}: Error finding container 635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7: Status 404 returned error can't find the container with id 635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7
Mar 20 15:04:40 crc kubenswrapper[4764]: I0320 15:04:40.109750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerStarted","Data":"f3462195da47bc8a5284c0f7ceb190ac5f653ded8a7a2301c91217dc76b88898"}
Mar 20 15:04:40 crc kubenswrapper[4764]: I0320 15:04:40.110274 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerStarted","Data":"635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7"}
Mar 20 15:04:41 crc kubenswrapper[4764]: I0320 15:04:41.119481 4764 generic.go:334] "Generic (PLEG): container finished" podID="66364f10-f71c-48d8-9691-6b840a589609" containerID="f3462195da47bc8a5284c0f7ceb190ac5f653ded8a7a2301c91217dc76b88898" exitCode=0
Mar 20 15:04:41 crc kubenswrapper[4764]: I0320 15:04:41.119531 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerDied","Data":"f3462195da47bc8a5284c0f7ceb190ac5f653ded8a7a2301c91217dc76b88898"}
Mar 20 15:04:44 crc kubenswrapper[4764]: I0320 15:04:44.147957 4764 generic.go:334] "Generic (PLEG): container finished" podID="66364f10-f71c-48d8-9691-6b840a589609" containerID="6b390c1efbf2f05fd9482dbdac4e30d1ffdc308d5ad93b8f942934965017167b" exitCode=0
Mar 20 15:04:44 crc kubenswrapper[4764]: I0320 15:04:44.148064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerDied","Data":"6b390c1efbf2f05fd9482dbdac4e30d1ffdc308d5ad93b8f942934965017167b"}
Mar 20 15:04:45 crc kubenswrapper[4764]: I0320 15:04:45.158313 4764 generic.go:334] "Generic (PLEG): container finished" podID="66364f10-f71c-48d8-9691-6b840a589609" containerID="aac4f3e4560630b953e2cdc666fba0f908cc36091a69b8017d1b2033398e2bdb" exitCode=0
Mar 20 15:04:45 crc kubenswrapper[4764]: I0320 15:04:45.158407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerDied","Data":"aac4f3e4560630b953e2cdc666fba0f908cc36091a69b8017d1b2033398e2bdb"}
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.482574 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.612339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle\") pod \"66364f10-f71c-48d8-9691-6b840a589609\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") "
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.612478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frrs5\" (UniqueName: \"kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5\") pod \"66364f10-f71c-48d8-9691-6b840a589609\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") "
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.612564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util\") pod \"66364f10-f71c-48d8-9691-6b840a589609\" (UID: \"66364f10-f71c-48d8-9691-6b840a589609\") "
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.613268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle" (OuterVolumeSpecName: "bundle") pod "66364f10-f71c-48d8-9691-6b840a589609" (UID: "66364f10-f71c-48d8-9691-6b840a589609"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.620248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5" (OuterVolumeSpecName: "kube-api-access-frrs5") pod "66364f10-f71c-48d8-9691-6b840a589609" (UID: "66364f10-f71c-48d8-9691-6b840a589609"). InnerVolumeSpecName "kube-api-access-frrs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.622511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util" (OuterVolumeSpecName: "util") pod "66364f10-f71c-48d8-9691-6b840a589609" (UID: "66364f10-f71c-48d8-9691-6b840a589609"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.713990 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-util\") on node \"crc\" DevicePath \"\""
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.714043 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66364f10-f71c-48d8-9691-6b840a589609-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:04:46 crc kubenswrapper[4764]: I0320 15:04:46.714065 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frrs5\" (UniqueName: \"kubernetes.io/projected/66364f10-f71c-48d8-9691-6b840a589609-kube-api-access-frrs5\") on node \"crc\" DevicePath \"\""
Mar 20 15:04:47 crc kubenswrapper[4764]: I0320 15:04:47.174016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th" event={"ID":"66364f10-f71c-48d8-9691-6b840a589609","Type":"ContainerDied","Data":"635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7"}
Mar 20 15:04:47 crc kubenswrapper[4764]: I0320 15:04:47.174054 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635b9bfc0578a61e951d9bbc987f470e742ba9217e84a85ce3963c747f6a13e7"
Mar 20 15:04:47 crc kubenswrapper[4764]: I0320 15:04:47.174124 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.832783 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"]
Mar 20 15:04:50 crc kubenswrapper[4764]: E0320 15:04:50.834186 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="util"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.834315 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="util"
Mar 20 15:04:50 crc kubenswrapper[4764]: E0320 15:04:50.834421 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="extract"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.834491 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="extract"
Mar 20 15:04:50 crc kubenswrapper[4764]: E0320 15:04:50.834568 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="pull"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.834635 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="pull"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.834839 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66364f10-f71c-48d8-9691-6b840a589609" containerName="extract"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.835441 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.837926 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.838838 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x2zbl"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.840769 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.853464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"]
Mar 20 15:04:50 crc kubenswrapper[4764]: I0320 15:04:50.973785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ppfm\" (UniqueName: \"kubernetes.io/projected/12847360-40b1-45fc-aa82-6926fe0d9b8a-kube-api-access-7ppfm\") pod \"nmstate-operator-796d4cfff4-hlh2h\" (UID: \"12847360-40b1-45fc-aa82-6926fe0d9b8a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"
Mar 20 15:04:51 crc kubenswrapper[4764]: I0320 15:04:51.074950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ppfm\" (UniqueName: \"kubernetes.io/projected/12847360-40b1-45fc-aa82-6926fe0d9b8a-kube-api-access-7ppfm\") pod \"nmstate-operator-796d4cfff4-hlh2h\" (UID: \"12847360-40b1-45fc-aa82-6926fe0d9b8a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"
Mar 20 15:04:51 crc kubenswrapper[4764]: I0320 15:04:51.107322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ppfm\" (UniqueName: \"kubernetes.io/projected/12847360-40b1-45fc-aa82-6926fe0d9b8a-kube-api-access-7ppfm\") pod \"nmstate-operator-796d4cfff4-hlh2h\" (UID: \"12847360-40b1-45fc-aa82-6926fe0d9b8a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"
Mar 20 15:04:51 crc kubenswrapper[4764]: I0320 15:04:51.153120 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"
Mar 20 15:04:51 crc kubenswrapper[4764]: I0320 15:04:51.394271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h"]
Mar 20 15:04:51 crc kubenswrapper[4764]: W0320 15:04:51.401104 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12847360_40b1_45fc_aa82_6926fe0d9b8a.slice/crio-e9c139a36f8bf8b284965066a99caaa2ae11a092b3c22d0281ae68609a9f628d WatchSource:0}: Error finding container e9c139a36f8bf8b284965066a99caaa2ae11a092b3c22d0281ae68609a9f628d: Status 404 returned error can't find the container with id e9c139a36f8bf8b284965066a99caaa2ae11a092b3c22d0281ae68609a9f628d
Mar 20 15:04:52 crc kubenswrapper[4764]: I0320 15:04:52.212241 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h" event={"ID":"12847360-40b1-45fc-aa82-6926fe0d9b8a","Type":"ContainerStarted","Data":"e9c139a36f8bf8b284965066a99caaa2ae11a092b3c22d0281ae68609a9f628d"}
Mar 20 15:04:55 crc kubenswrapper[4764]: I0320 15:04:55.230005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h" event={"ID":"12847360-40b1-45fc-aa82-6926fe0d9b8a","Type":"ContainerStarted","Data":"79b8c94354f77b4b38f66af981ac408c9205e6affcb14129c9f4adb5b9bcddee"}
Mar 20 15:04:55 crc kubenswrapper[4764]: I0320 15:04:55.250042 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hlh2h" podStartSLOduration=1.746330379 podStartE2EDuration="5.250024312s" podCreationTimestamp="2026-03-20 15:04:50 +0000 UTC" firstStartedPulling="2026-03-20 15:04:51.403009988 +0000 UTC m=+813.019199137" lastFinishedPulling="2026-03-20 15:04:54.906703951 +0000 UTC m=+816.522893070" observedRunningTime="2026-03-20 15:04:55.248104421 +0000 UTC m=+816.864293550" watchObservedRunningTime="2026-03-20 15:04:55.250024312 +0000 UTC m=+816.866213441"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.579274 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.580319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.582860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-948pw"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.601968 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.602850 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.604924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.605673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.628525 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vlrv5"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.629194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.647627 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.690417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzqn\" (UniqueName: \"kubernetes.io/projected/d682151d-aba5-445d-89fc-d0e67cf41258-kube-api-access-llzqn\") pod \"nmstate-metrics-9b8c8685d-j6b82\" (UID: \"d682151d-aba5-445d-89fc-d0e67cf41258\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.690485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b89d9212-3f44-4f7e-8a90-80752f91f8d8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.690629 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2kg\" (UniqueName: \"kubernetes.io/projected/b89d9212-3f44-4f7e-8a90-80752f91f8d8-kube-api-access-cw2kg\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.714965 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.715588 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.717348 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.717516 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.717737 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gwx29"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.732055 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287"]
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b89d9212-3f44-4f7e-8a90-80752f91f8d8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2kg\" (UniqueName: \"kubernetes.io/projected/b89d9212-3f44-4f7e-8a90-80752f91f8d8-kube-api-access-cw2kg\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-dbus-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-nmstate-lock\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzqn\" (UniqueName: \"kubernetes.io/projected/d682151d-aba5-445d-89fc-d0e67cf41258-kube-api-access-llzqn\") pod \"nmstate-metrics-9b8c8685d-j6b82\" (UID: \"d682151d-aba5-445d-89fc-d0e67cf41258\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-ovs-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.791663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4pw\" (UniqueName: \"kubernetes.io/projected/6969b81c-842c-4f38-9749-0716f49aff6c-kube-api-access-xw4pw\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.805999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b89d9212-3f44-4f7e-8a90-80752f91f8d8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.810037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2kg\" (UniqueName: \"kubernetes.io/projected/b89d9212-3f44-4f7e-8a90-80752f91f8d8-kube-api-access-cw2kg\") pod \"nmstate-webhook-5f558f5558-mrkwg\" (UID: \"b89d9212-3f44-4f7e-8a90-80752f91f8d8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.815656 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzqn\" (UniqueName: \"kubernetes.io/projected/d682151d-aba5-445d-89fc-d0e67cf41258-kube-api-access-llzqn\") pod \"nmstate-metrics-9b8c8685d-j6b82\" (UID: \"d682151d-aba5-445d-89fc-d0e67cf41258\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287"
Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-dbus-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-nmstate-lock\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-ovs-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4pw\" (UniqueName: \"kubernetes.io/projected/6969b81c-842c-4f38-9749-0716f49aff6c-kube-api-access-xw4pw\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892923 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-nmstate-lock\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.892966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwnb\" (UniqueName: \"kubernetes.io/projected/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-kube-api-access-gfwnb\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.893024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-ovs-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.893140 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6969b81c-842c-4f38-9749-0716f49aff6c-dbus-socket\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.898108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.898337 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85cb976d8d-bqpm2"] Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.899143 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.916276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4pw\" (UniqueName: \"kubernetes.io/projected/6969b81c-842c-4f38-9749-0716f49aff6c-kube-api-access-xw4pw\") pod \"nmstate-handler-vlrv5\" (UID: \"6969b81c-842c-4f38-9749-0716f49aff6c\") " pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.923290 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.925510 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85cb976d8d-bqpm2"] Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.950793 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:04:59 crc kubenswrapper[4764]: W0320 15:04:59.973901 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6969b81c_842c_4f38_9749_0716f49aff6c.slice/crio-e48db18b0027f37655d5f690ff04eda35e1aaa1c621bd8d069c99381bb0c4c67 WatchSource:0}: Error finding container e48db18b0027f37655d5f690ff04eda35e1aaa1c621bd8d069c99381bb0c4c67: Status 404 returned error can't find the container with id e48db18b0027f37655d5f690ff04eda35e1aaa1c621bd8d069c99381bb0c4c67 Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4c25\" (UniqueName: \"kubernetes.io/projected/056e5409-b38c-44bd-a3b0-5e903f01e905-kube-api-access-t4c25\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc 
kubenswrapper[4764]: I0320 15:04:59.997550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwnb\" (UniqueName: \"kubernetes.io/projected/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-kube-api-access-gfwnb\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-trusted-ca-bundle\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-console-config\") pod \"console-85cb976d8d-bqpm2\" (UID: 
\"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-oauth-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997815 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-oauth-config\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.997863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-service-ca\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:04:59 crc kubenswrapper[4764]: I0320 15:04:59.999598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-nginx-conf\") pod 
\"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.005250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.015077 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwnb\" (UniqueName: \"kubernetes.io/projected/f7b7d942-96ce-48ee-a090-ac7dc8fd3e27-kube-api-access-gfwnb\") pod \"nmstate-console-plugin-86f58fcf4-2k287\" (UID: \"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.029024 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-console-config\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-oauth-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-oauth-config\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-service-ca\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") 
" pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099302 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4c25\" (UniqueName: \"kubernetes.io/projected/056e5409-b38c-44bd-a3b0-5e903f01e905-kube-api-access-t4c25\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.099341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-trusted-ca-bundle\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.100586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-trusted-ca-bundle\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.101995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-service-ca\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.102009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-console-config\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 
15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.102182 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/056e5409-b38c-44bd-a3b0-5e903f01e905-oauth-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.103505 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-serving-cert\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.104915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/056e5409-b38c-44bd-a3b0-5e903f01e905-console-oauth-config\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.116346 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4c25\" (UniqueName: \"kubernetes.io/projected/056e5409-b38c-44bd-a3b0-5e903f01e905-kube-api-access-t4c25\") pod \"console-85cb976d8d-bqpm2\" (UID: \"056e5409-b38c-44bd-a3b0-5e903f01e905\") " pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.159209 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"] Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.216758 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85cb976d8d-bqpm2" Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.217503 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287"] Mar 20 15:05:00 crc kubenswrapper[4764]: W0320 15:05:00.222569 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b7d942_96ce_48ee_a090_ac7dc8fd3e27.slice/crio-a91498684b9b845a4657e92cb3fc39d24982f8599811cc3efa42d1e464bc8627 WatchSource:0}: Error finding container a91498684b9b845a4657e92cb3fc39d24982f8599811cc3efa42d1e464bc8627: Status 404 returned error can't find the container with id a91498684b9b845a4657e92cb3fc39d24982f8599811cc3efa42d1e464bc8627 Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.267192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg" event={"ID":"b89d9212-3f44-4f7e-8a90-80752f91f8d8","Type":"ContainerStarted","Data":"789c68b774d0e5154f8349956a24c83dd4cbe32719eb4b4b89994ff026e15fef"} Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.268315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vlrv5" event={"ID":"6969b81c-842c-4f38-9749-0716f49aff6c","Type":"ContainerStarted","Data":"e48db18b0027f37655d5f690ff04eda35e1aaa1c621bd8d069c99381bb0c4c67"} Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.269294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" event={"ID":"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27","Type":"ContainerStarted","Data":"a91498684b9b845a4657e92cb3fc39d24982f8599811cc3efa42d1e464bc8627"} Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.322429 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82"] Mar 20 15:05:00 crc kubenswrapper[4764]: W0320 
15:05:00.328215 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd682151d_aba5_445d_89fc_d0e67cf41258.slice/crio-165f520051f0cfbd8d1e196efce2a1bf3e0d7cc76fdafbe5513a787b567d1ad8 WatchSource:0}: Error finding container 165f520051f0cfbd8d1e196efce2a1bf3e0d7cc76fdafbe5513a787b567d1ad8: Status 404 returned error can't find the container with id 165f520051f0cfbd8d1e196efce2a1bf3e0d7cc76fdafbe5513a787b567d1ad8 Mar 20 15:05:00 crc kubenswrapper[4764]: I0320 15:05:00.413445 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85cb976d8d-bqpm2"] Mar 20 15:05:00 crc kubenswrapper[4764]: W0320 15:05:00.414804 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod056e5409_b38c_44bd_a3b0_5e903f01e905.slice/crio-4d0af403496c93d39cb936307ddca3cbd134e4db699770c40b7abddac69007d6 WatchSource:0}: Error finding container 4d0af403496c93d39cb936307ddca3cbd134e4db699770c40b7abddac69007d6: Status 404 returned error can't find the container with id 4d0af403496c93d39cb936307ddca3cbd134e4db699770c40b7abddac69007d6 Mar 20 15:05:01 crc kubenswrapper[4764]: I0320 15:05:01.279368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82" event={"ID":"d682151d-aba5-445d-89fc-d0e67cf41258","Type":"ContainerStarted","Data":"165f520051f0cfbd8d1e196efce2a1bf3e0d7cc76fdafbe5513a787b567d1ad8"} Mar 20 15:05:01 crc kubenswrapper[4764]: I0320 15:05:01.281924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85cb976d8d-bqpm2" event={"ID":"056e5409-b38c-44bd-a3b0-5e903f01e905","Type":"ContainerStarted","Data":"17401aaa44eaca5ce405341b3d582bfa26ff1d67d1ce14c355fe107186ac7445"} Mar 20 15:05:01 crc kubenswrapper[4764]: I0320 15:05:01.281974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-85cb976d8d-bqpm2" event={"ID":"056e5409-b38c-44bd-a3b0-5e903f01e905","Type":"ContainerStarted","Data":"4d0af403496c93d39cb936307ddca3cbd134e4db699770c40b7abddac69007d6"} Mar 20 15:05:01 crc kubenswrapper[4764]: I0320 15:05:01.306776 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85cb976d8d-bqpm2" podStartSLOduration=2.306751372 podStartE2EDuration="2.306751372s" podCreationTimestamp="2026-03-20 15:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:05:01.300416975 +0000 UTC m=+822.916606144" watchObservedRunningTime="2026-03-20 15:05:01.306751372 +0000 UTC m=+822.922940531" Mar 20 15:05:03 crc kubenswrapper[4764]: I0320 15:05:03.773117 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.305391 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vlrv5" event={"ID":"6969b81c-842c-4f38-9749-0716f49aff6c","Type":"ContainerStarted","Data":"bda2d4503b50f7339316c67704f6d9f63b8e23d26d02a8602cd1a893698a12ab"} Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.305903 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vlrv5" Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.307780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" event={"ID":"f7b7d942-96ce-48ee-a090-ac7dc8fd3e27","Type":"ContainerStarted","Data":"00784e8ec736cf0c2b3875d2f8971c3e608267705523cd4f1319f04b50f13fd2"} Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.309232 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg" 
event={"ID":"b89d9212-3f44-4f7e-8a90-80752f91f8d8","Type":"ContainerStarted","Data":"ad27e06b4acf59e7724deb5b4e3c6afb81bec2a60921030f727ea68adb6dcf9f"} Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.309312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg" Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.310741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82" event={"ID":"d682151d-aba5-445d-89fc-d0e67cf41258","Type":"ContainerStarted","Data":"4fb52b3f87040a0e7c79c5aa9d9c55c29023d26504c8b99c392d5f596efcc5f3"} Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.333844 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vlrv5" podStartSLOduration=1.733753426 podStartE2EDuration="5.333829338s" podCreationTimestamp="2026-03-20 15:04:59 +0000 UTC" firstStartedPulling="2026-03-20 15:04:59.975906174 +0000 UTC m=+821.592095303" lastFinishedPulling="2026-03-20 15:05:03.575982086 +0000 UTC m=+825.192171215" observedRunningTime="2026-03-20 15:05:04.332706263 +0000 UTC m=+825.948895432" watchObservedRunningTime="2026-03-20 15:05:04.333829338 +0000 UTC m=+825.950018467" Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.361130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg" podStartSLOduration=1.987296686 podStartE2EDuration="5.361110819s" podCreationTimestamp="2026-03-20 15:04:59 +0000 UTC" firstStartedPulling="2026-03-20 15:05:00.168554834 +0000 UTC m=+821.784743953" lastFinishedPulling="2026-03-20 15:05:03.542368957 +0000 UTC m=+825.158558086" observedRunningTime="2026-03-20 15:05:04.355935958 +0000 UTC m=+825.972125167" watchObservedRunningTime="2026-03-20 15:05:04.361110819 +0000 UTC m=+825.977299948" Mar 20 15:05:04 crc kubenswrapper[4764]: I0320 15:05:04.384684 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2k287" podStartSLOduration=2.06532287 podStartE2EDuration="5.384661644s" podCreationTimestamp="2026-03-20 15:04:59 +0000 UTC" firstStartedPulling="2026-03-20 15:05:00.224338244 +0000 UTC m=+821.840527373" lastFinishedPulling="2026-03-20 15:05:03.543676978 +0000 UTC m=+825.159866147" observedRunningTime="2026-03-20 15:05:04.38327282 +0000 UTC m=+825.999461979" watchObservedRunningTime="2026-03-20 15:05:04.384661644 +0000 UTC m=+826.000850813" Mar 20 15:05:07 crc kubenswrapper[4764]: I0320 15:05:07.338794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82" event={"ID":"d682151d-aba5-445d-89fc-d0e67cf41258","Type":"ContainerStarted","Data":"12c994a2cb4943ffaf089570c1a9e6157944d5e01352b6d9ee1ae27aa07de676"} Mar 20 15:05:07 crc kubenswrapper[4764]: I0320 15:05:07.370603 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j6b82" podStartSLOduration=2.456746272 podStartE2EDuration="8.370579135s" podCreationTimestamp="2026-03-20 15:04:59 +0000 UTC" firstStartedPulling="2026-03-20 15:05:00.342531882 +0000 UTC m=+821.958721011" lastFinishedPulling="2026-03-20 15:05:06.256364755 +0000 UTC m=+827.872553874" observedRunningTime="2026-03-20 15:05:07.364150515 +0000 UTC m=+828.980339674" watchObservedRunningTime="2026-03-20 15:05:07.370579135 +0000 UTC m=+828.986768304" Mar 20 15:05:07 crc kubenswrapper[4764]: I0320 15:05:07.737767 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 15:05:08 crc kubenswrapper[4764]: I0320 15:05:08.443765 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:05:08 crc kubenswrapper[4764]: I0320 15:05:08.443839 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:05:08 crc kubenswrapper[4764]: I0320 15:05:08.443892 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5"
Mar 20 15:05:08 crc kubenswrapper[4764]: I0320 15:05:08.444530 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:05:08 crc kubenswrapper[4764]: I0320 15:05:08.444621 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04" gracePeriod=600
Mar 20 15:05:09 crc kubenswrapper[4764]: I0320 15:05:09.357408 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04" exitCode=0
Mar 20 15:05:09 crc kubenswrapper[4764]: I0320 15:05:09.357511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04"}
Mar 20 15:05:09 crc kubenswrapper[4764]: I0320 15:05:09.358186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34"}
Mar 20 15:05:09 crc kubenswrapper[4764]: I0320 15:05:09.358281 4764 scope.go:117] "RemoveContainer" containerID="e7f1bbc51003593363e1b74e35ea662eb207292352041beedda252c7cfb9003c"
Mar 20 15:05:09 crc kubenswrapper[4764]: I0320 15:05:09.984411 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vlrv5"
Mar 20 15:05:10 crc kubenswrapper[4764]: I0320 15:05:10.217128 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85cb976d8d-bqpm2"
Mar 20 15:05:10 crc kubenswrapper[4764]: I0320 15:05:10.217502 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85cb976d8d-bqpm2"
Mar 20 15:05:10 crc kubenswrapper[4764]: I0320 15:05:10.224951 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85cb976d8d-bqpm2"
Mar 20 15:05:10 crc kubenswrapper[4764]: I0320 15:05:10.376146 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85cb976d8d-bqpm2"
Mar 20 15:05:10 crc kubenswrapper[4764]: I0320 15:05:10.426720 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5z97v"]
Mar 20 15:05:16 crc kubenswrapper[4764]: I0320 15:05:16.979250 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:16 crc kubenswrapper[4764]: I0320 15:05:16.981699 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:16 crc kubenswrapper[4764]: I0320 15:05:16.998350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.073884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.074707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p8cb\" (UniqueName: \"kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.074896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.177182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.177319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p8cb\" (UniqueName: \"kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.177369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.178661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.178825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.214333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p8cb\" (UniqueName: \"kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb\") pod \"certified-operators-9kbg6\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") " pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.322030 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:17 crc kubenswrapper[4764]: I0320 15:05:17.835835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:17 crc kubenswrapper[4764]: W0320 15:05:17.840171 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb847b911_0e47_4a38_8bd3_859e4a4fb580.slice/crio-69bfc882aa2612f5c00954d413a54a5e6e100a264938379fd633022fc58100f0 WatchSource:0}: Error finding container 69bfc882aa2612f5c00954d413a54a5e6e100a264938379fd633022fc58100f0: Status 404 returned error can't find the container with id 69bfc882aa2612f5c00954d413a54a5e6e100a264938379fd633022fc58100f0
Mar 20 15:05:18 crc kubenswrapper[4764]: I0320 15:05:18.433750 4764 generic.go:334] "Generic (PLEG): container finished" podID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerID="4d6e0eb3fcc174c96d24df7491320a4a1e812e13795e19968660aaade4ae317c" exitCode=0
Mar 20 15:05:18 crc kubenswrapper[4764]: I0320 15:05:18.433816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerDied","Data":"4d6e0eb3fcc174c96d24df7491320a4a1e812e13795e19968660aaade4ae317c"}
Mar 20 15:05:18 crc kubenswrapper[4764]: I0320 15:05:18.434026 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerStarted","Data":"69bfc882aa2612f5c00954d413a54a5e6e100a264938379fd633022fc58100f0"}
Mar 20 15:05:19 crc kubenswrapper[4764]: I0320 15:05:19.932521 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mrkwg"
Mar 20 15:05:20 crc kubenswrapper[4764]: I0320 15:05:20.451286 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerStarted","Data":"2bf41eb6dbb6364d8fb2a359d03d4ff10ad2b1cdcaed3e7401a8bff5f89cced5"}
Mar 20 15:05:21 crc kubenswrapper[4764]: I0320 15:05:21.461182 4764 generic.go:334] "Generic (PLEG): container finished" podID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerID="2bf41eb6dbb6364d8fb2a359d03d4ff10ad2b1cdcaed3e7401a8bff5f89cced5" exitCode=0
Mar 20 15:05:21 crc kubenswrapper[4764]: I0320 15:05:21.461232 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerDied","Data":"2bf41eb6dbb6364d8fb2a359d03d4ff10ad2b1cdcaed3e7401a8bff5f89cced5"}
Mar 20 15:05:22 crc kubenswrapper[4764]: I0320 15:05:22.480912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerStarted","Data":"e037ba97261171214c39a3fc7d499a09ca9d7dc04ba8eaeade220c768a10b14d"}
Mar 20 15:05:22 crc kubenswrapper[4764]: I0320 15:05:22.503342 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9kbg6" podStartSLOduration=2.713535131 podStartE2EDuration="6.50332702s" podCreationTimestamp="2026-03-20 15:05:16 +0000 UTC" firstStartedPulling="2026-03-20 15:05:18.436432237 +0000 UTC m=+840.052621396" lastFinishedPulling="2026-03-20 15:05:22.226224146 +0000 UTC m=+843.842413285" observedRunningTime="2026-03-20 15:05:22.501862604 +0000 UTC m=+844.118051783" watchObservedRunningTime="2026-03-20 15:05:22.50332702 +0000 UTC m=+844.119516159"
Mar 20 15:05:27 crc kubenswrapper[4764]: I0320 15:05:27.322900 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:27 crc kubenswrapper[4764]: I0320 15:05:27.323528 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.128739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.171116 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.793993 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-crxj8"]
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.816089 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crxj8"]
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.816279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.943341 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7vc\" (UniqueName: \"kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.943455 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:28 crc kubenswrapper[4764]: I0320 15:05:28.943498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.044597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7vc\" (UniqueName: \"kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.044958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.044989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.045623 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.045687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.079764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7vc\" (UniqueName: \"kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc\") pod \"community-operators-crxj8\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.197726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-crxj8"
Mar 20 15:05:29 crc kubenswrapper[4764]: I0320 15:05:29.583518 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-crxj8"]
Mar 20 15:05:29 crc kubenswrapper[4764]: W0320 15:05:29.599494 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff36760_c1d7_45a0_89cd_9a03fe1b113c.slice/crio-ddb07db6d82d74463be1dd0aeb79f21371ef648f5fc0ffee20c720ede61ae61f WatchSource:0}: Error finding container ddb07db6d82d74463be1dd0aeb79f21371ef648f5fc0ffee20c720ede61ae61f: Status 404 returned error can't find the container with id ddb07db6d82d74463be1dd0aeb79f21371ef648f5fc0ffee20c720ede61ae61f
Mar 20 15:05:30 crc kubenswrapper[4764]: I0320 15:05:30.550220 4764 generic.go:334] "Generic (PLEG): container finished" podID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerID="1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6" exitCode=0
Mar 20 15:05:30 crc kubenswrapper[4764]: I0320 15:05:30.550303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerDied","Data":"1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6"}
Mar 20 15:05:30 crc kubenswrapper[4764]: I0320 15:05:30.551058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerStarted","Data":"ddb07db6d82d74463be1dd0aeb79f21371ef648f5fc0ffee20c720ede61ae61f"}
Mar 20 15:05:31 crc kubenswrapper[4764]: I0320 15:05:31.569694 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:31 crc kubenswrapper[4764]: I0320 15:05:31.571219 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9kbg6" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="registry-server" containerID="cri-o://e037ba97261171214c39a3fc7d499a09ca9d7dc04ba8eaeade220c768a10b14d" gracePeriod=2
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.581034 4764 generic.go:334] "Generic (PLEG): container finished" podID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerID="dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c" exitCode=0
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.581144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerDied","Data":"dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c"}
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.588182 4764 generic.go:334] "Generic (PLEG): container finished" podID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerID="e037ba97261171214c39a3fc7d499a09ca9d7dc04ba8eaeade220c768a10b14d" exitCode=0
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.588229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerDied","Data":"e037ba97261171214c39a3fc7d499a09ca9d7dc04ba8eaeade220c768a10b14d"}
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.622278 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.737816 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities\") pod \"b847b911-0e47-4a38-8bd3-859e4a4fb580\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") "
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.738210 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content\") pod \"b847b911-0e47-4a38-8bd3-859e4a4fb580\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") "
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.738240 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p8cb\" (UniqueName: \"kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb\") pod \"b847b911-0e47-4a38-8bd3-859e4a4fb580\" (UID: \"b847b911-0e47-4a38-8bd3-859e4a4fb580\") "
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.741045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities" (OuterVolumeSpecName: "utilities") pod "b847b911-0e47-4a38-8bd3-859e4a4fb580" (UID: "b847b911-0e47-4a38-8bd3-859e4a4fb580"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.747974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb" (OuterVolumeSpecName: "kube-api-access-6p8cb") pod "b847b911-0e47-4a38-8bd3-859e4a4fb580" (UID: "b847b911-0e47-4a38-8bd3-859e4a4fb580"). InnerVolumeSpecName "kube-api-access-6p8cb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.802452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b847b911-0e47-4a38-8bd3-859e4a4fb580" (UID: "b847b911-0e47-4a38-8bd3-859e4a4fb580"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.840242 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.840277 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b847b911-0e47-4a38-8bd3-859e4a4fb580-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:32 crc kubenswrapper[4764]: I0320 15:05:32.840292 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p8cb\" (UniqueName: \"kubernetes.io/projected/b847b911-0e47-4a38-8bd3-859e4a4fb580-kube-api-access-6p8cb\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.597601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kbg6" event={"ID":"b847b911-0e47-4a38-8bd3-859e4a4fb580","Type":"ContainerDied","Data":"69bfc882aa2612f5c00954d413a54a5e6e100a264938379fd633022fc58100f0"}
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.597660 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kbg6"
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.598126 4764 scope.go:117] "RemoveContainer" containerID="e037ba97261171214c39a3fc7d499a09ca9d7dc04ba8eaeade220c768a10b14d"
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.603136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerStarted","Data":"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5"}
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.626498 4764 scope.go:117] "RemoveContainer" containerID="2bf41eb6dbb6364d8fb2a359d03d4ff10ad2b1cdcaed3e7401a8bff5f89cced5"
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.648597 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-crxj8" podStartSLOduration=3.205130351 podStartE2EDuration="5.64857223s" podCreationTimestamp="2026-03-20 15:05:28 +0000 UTC" firstStartedPulling="2026-03-20 15:05:30.554682358 +0000 UTC m=+852.170871527" lastFinishedPulling="2026-03-20 15:05:32.998124267 +0000 UTC m=+854.614313406" observedRunningTime="2026-03-20 15:05:33.634702855 +0000 UTC m=+855.250892044" watchObservedRunningTime="2026-03-20 15:05:33.64857223 +0000 UTC m=+855.264761369"
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.648954 4764 scope.go:117] "RemoveContainer" containerID="4d6e0eb3fcc174c96d24df7491320a4a1e812e13795e19968660aaade4ae317c"
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.667794 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:33 crc kubenswrapper[4764]: I0320 15:05:33.672412 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9kbg6"]
Mar 20 15:05:34 crc kubenswrapper[4764]: I0320 15:05:34.096769 4764 scope.go:117] "RemoveContainer" containerID="a89ce12233e8c868c31745aaf4798551a59ff9219e07e009b78099d9eaa24730"
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.139670 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" path="/var/lib/kubelet/pods/b847b911-0e47-4a38-8bd3-859e4a4fb580/volumes"
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.468512 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5z97v" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerName="console" containerID="cri-o://ac9ed294f69a85d38f7b9c53419749819756c5e9de9a056349b71a50ade1dc37" gracePeriod=15
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.626075 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5z97v_82463101-a3d9-4a1b-a180-aba0318fbeb4/console/0.log"
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.626497 4764 generic.go:334] "Generic (PLEG): container finished" podID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerID="ac9ed294f69a85d38f7b9c53419749819756c5e9de9a056349b71a50ade1dc37" exitCode=2
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.626545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5z97v" event={"ID":"82463101-a3d9-4a1b-a180-aba0318fbeb4","Type":"ContainerDied","Data":"ac9ed294f69a85d38f7b9c53419749819756c5e9de9a056349b71a50ade1dc37"}
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.966149 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5z97v_82463101-a3d9-4a1b-a180-aba0318fbeb4/console/0.log"
Mar 20 15:05:35 crc kubenswrapper[4764]: I0320 15:05:35.966267 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095262 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095360 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt45n\" (UniqueName: \"kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095511 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095576 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095620 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.095676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle\") pod \"82463101-a3d9-4a1b-a180-aba0318fbeb4\" (UID: \"82463101-a3d9-4a1b-a180-aba0318fbeb4\") "
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.096739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.096756 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca" (OuterVolumeSpecName: "service-ca") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.096836 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.096927 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config" (OuterVolumeSpecName: "console-config") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.103026 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.103499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.104131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n" (OuterVolumeSpecName: "kube-api-access-vt45n") pod "82463101-a3d9-4a1b-a180-aba0318fbeb4" (UID: "82463101-a3d9-4a1b-a180-aba0318fbeb4"). InnerVolumeSpecName "kube-api-access-vt45n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.196952 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.196998 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.197019 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.197037 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt45n\" (UniqueName: \"kubernetes.io/projected/82463101-a3d9-4a1b-a180-aba0318fbeb4-kube-api-access-vt45n\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.197057 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.197074 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.197090 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82463101-a3d9-4a1b-a180-aba0318fbeb4-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.636762 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5z97v_82463101-a3d9-4a1b-a180-aba0318fbeb4/console/0.log"
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.636855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5z97v" event={"ID":"82463101-a3d9-4a1b-a180-aba0318fbeb4","Type":"ContainerDied","Data":"afe04cae588f16c1ecda6a0e7c85e24fc865724d668c296e048a95a78e42c2e8"}
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.636918 4764 scope.go:117] "RemoveContainer" containerID="ac9ed294f69a85d38f7b9c53419749819756c5e9de9a056349b71a50ade1dc37"
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.636953 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5z97v"
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.688859 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5z97v"]
Mar 20 15:05:36 crc kubenswrapper[4764]: I0320 15:05:36.696148 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5z97v"]
Mar 20 15:05:37 crc kubenswrapper[4764]: I0320 15:05:37.134199 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" path="/var/lib/kubelet/pods/82463101-a3d9-4a1b-a180-aba0318fbeb4/volumes"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.635065 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"]
Mar 20 15:05:38 crc kubenswrapper[4764]: E0320 15:05:38.635950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerName="console"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.635985 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerName="console"
Mar 20 15:05:38 crc kubenswrapper[4764]: E0320 15:05:38.636021 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="extract-content"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.636043 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="extract-content"
Mar 20 15:05:38 crc kubenswrapper[4764]: E0320 15:05:38.636067 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="registry-server"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.636084 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="registry-server"
Mar 20 15:05:38 crc kubenswrapper[4764]: E0320 15:05:38.636124 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="extract-utilities"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.636142 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="extract-utilities"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.636440 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="82463101-a3d9-4a1b-a180-aba0318fbeb4" containerName="console"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.636486 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b847b911-0e47-4a38-8bd3-859e4a4fb580" containerName="registry-server"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.638322 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.640504 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.648009 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"]
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.833004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjdzd\" (UniqueName: \"kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.833207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"
Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.833302 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"
Mar 20 15:05:38 crc kubenswrapper[4764]:
I0320 15:05:38.934180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjdzd\" (UniqueName: \"kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.934284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.934322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.935024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.935317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.969538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjdzd\" (UniqueName: \"kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:38 crc kubenswrapper[4764]: I0320 15:05:38.973567 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.199090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.199561 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.263574 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.463584 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm"] Mar 20 15:05:39 crc kubenswrapper[4764]: W0320 15:05:39.473614 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cc9951_8e26_4c96_8dae_0151015c425b.slice/crio-b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195 WatchSource:0}: Error finding container b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195: Status 404 returned error can't find the container with id b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195 Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.660296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerStarted","Data":"f00e70dfb492d130f17863a871e38e82e4bd4264b200c433ed5c5a818debd783"} Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.660351 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerStarted","Data":"b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195"} Mar 20 15:05:39 crc kubenswrapper[4764]: I0320 15:05:39.711937 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:40 crc kubenswrapper[4764]: I0320 15:05:40.669924 4764 generic.go:334] "Generic (PLEG): container finished" podID="28cc9951-8e26-4c96-8dae-0151015c425b" containerID="f00e70dfb492d130f17863a871e38e82e4bd4264b200c433ed5c5a818debd783" exitCode=0 Mar 20 15:05:40 crc kubenswrapper[4764]: I0320 15:05:40.670028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerDied","Data":"f00e70dfb492d130f17863a871e38e82e4bd4264b200c433ed5c5a818debd783"} Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 
15:05:42.176563 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.178043 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.192005 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.277274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.277325 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.277399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfrk\" (UniqueName: \"kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.378870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfrk\" (UniqueName: \"kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk\") 
pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.378960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.378988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.379524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.379743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content\") pod \"redhat-marketplace-dhbfh\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.412923 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfrk\" (UniqueName: \"kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk\") pod \"redhat-marketplace-dhbfh\" (UID: 
\"66298e5d-985d-471d-a38c-736024fafb8e\") " pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.500530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:42 crc kubenswrapper[4764]: I0320 15:05:42.791199 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.374024 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.378163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.393806 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.491462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.491537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fps\" (UniqueName: \"kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.491927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.593620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.593721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fps\" (UniqueName: \"kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.593826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.594470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.594470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.635066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2fps\" (UniqueName: \"kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps\") pod \"redhat-operators-rjwzb\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.697740 4764 generic.go:334] "Generic (PLEG): container finished" podID="66298e5d-985d-471d-a38c-736024fafb8e" containerID="22e666f85bd0e4878ab556d206ce5119bd7aa6a25086675f5ec4b69ddc684f3b" exitCode=0 Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.697794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerDied","Data":"22e666f85bd0e4878ab556d206ce5119bd7aa6a25086675f5ec4b69ddc684f3b"} Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.697826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerStarted","Data":"cb1e6b7cdd7f24428c76335c166f2ef1c3add4e913adc21a17a70e22d7d8fc5f"} Mar 20 15:05:43 crc kubenswrapper[4764]: I0320 15:05:43.706690 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:44 crc kubenswrapper[4764]: I0320 15:05:44.193919 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:05:45 crc kubenswrapper[4764]: W0320 15:05:45.697711 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e677413_ea55_4c6b_87a7_535d636e3851.slice/crio-ed59d57f0908e7df7ffffa445ba62b052742a787c5378bdac5fb46f519702dc9 WatchSource:0}: Error finding container ed59d57f0908e7df7ffffa445ba62b052742a787c5378bdac5fb46f519702dc9: Status 404 returned error can't find the container with id ed59d57f0908e7df7ffffa445ba62b052742a787c5378bdac5fb46f519702dc9 Mar 20 15:05:45 crc kubenswrapper[4764]: I0320 15:05:45.712175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerStarted","Data":"ed59d57f0908e7df7ffffa445ba62b052742a787c5378bdac5fb46f519702dc9"} Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.722160 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e677413-ea55-4c6b-87a7-535d636e3851" containerID="ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1" exitCode=0 Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.722308 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerDied","Data":"ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1"} Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.731294 4764 generic.go:334] "Generic (PLEG): container finished" podID="28cc9951-8e26-4c96-8dae-0151015c425b" containerID="e2b916f1768b69eccea0c78102f65cdffd5b52e0d67d4ee563cb76e2e3a953fa" exitCode=0 Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.731425 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerDied","Data":"e2b916f1768b69eccea0c78102f65cdffd5b52e0d67d4ee563cb76e2e3a953fa"} Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.734418 4764 generic.go:334] "Generic (PLEG): container finished" podID="66298e5d-985d-471d-a38c-736024fafb8e" containerID="3c15fec01d8e1b3a2fd958e648e522cc84e35853035258257e4c41a92ac7f57e" exitCode=0 Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.734487 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerDied","Data":"3c15fec01d8e1b3a2fd958e648e522cc84e35853035258257e4c41a92ac7f57e"} Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.964286 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crxj8"] Mar 20 15:05:46 crc kubenswrapper[4764]: I0320 15:05:46.964986 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-crxj8" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="registry-server" containerID="cri-o://487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5" gracePeriod=2 Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.419339 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.547094 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities\") pod \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.547529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content\") pod \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.547599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x7vc\" (UniqueName: \"kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc\") pod \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\" (UID: \"fff36760-c1d7-45a0-89cd-9a03fe1b113c\") " Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.548913 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities" (OuterVolumeSpecName: "utilities") pod "fff36760-c1d7-45a0-89cd-9a03fe1b113c" (UID: "fff36760-c1d7-45a0-89cd-9a03fe1b113c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.556873 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc" (OuterVolumeSpecName: "kube-api-access-6x7vc") pod "fff36760-c1d7-45a0-89cd-9a03fe1b113c" (UID: "fff36760-c1d7-45a0-89cd-9a03fe1b113c"). InnerVolumeSpecName "kube-api-access-6x7vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.649992 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x7vc\" (UniqueName: \"kubernetes.io/projected/fff36760-c1d7-45a0-89cd-9a03fe1b113c-kube-api-access-6x7vc\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.650292 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.650044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fff36760-c1d7-45a0-89cd-9a03fe1b113c" (UID: "fff36760-c1d7-45a0-89cd-9a03fe1b113c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.743448 4764 generic.go:334] "Generic (PLEG): container finished" podID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerID="487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5" exitCode=0 Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.743495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerDied","Data":"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5"} Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.743528 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-crxj8" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.744729 4764 scope.go:117] "RemoveContainer" containerID="487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.744709 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-crxj8" event={"ID":"fff36760-c1d7-45a0-89cd-9a03fe1b113c","Type":"ContainerDied","Data":"ddb07db6d82d74463be1dd0aeb79f21371ef648f5fc0ffee20c720ede61ae61f"} Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.751299 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff36760-c1d7-45a0-89cd-9a03fe1b113c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.755606 4764 generic.go:334] "Generic (PLEG): container finished" podID="28cc9951-8e26-4c96-8dae-0151015c425b" containerID="707beeae9f4711ef8b4f05c7d38a5133f8e9a3911db1f292ca88d5486a29c607" exitCode=0 Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.755666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerDied","Data":"707beeae9f4711ef8b4f05c7d38a5133f8e9a3911db1f292ca88d5486a29c607"} Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.768871 4764 scope.go:117] "RemoveContainer" containerID="dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.796937 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-crxj8"] Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.805500 4764 scope.go:117] "RemoveContainer" containerID="1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6" Mar 
20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.805820 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-crxj8"] Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.820032 4764 scope.go:117] "RemoveContainer" containerID="487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5" Mar 20 15:05:47 crc kubenswrapper[4764]: E0320 15:05:47.820533 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5\": container with ID starting with 487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5 not found: ID does not exist" containerID="487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.820579 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5"} err="failed to get container status \"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5\": rpc error: code = NotFound desc = could not find container \"487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5\": container with ID starting with 487fa2bdcdb301f187c09e561439526d678022fd245c5d73840bfb694b0470d5 not found: ID does not exist" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.820610 4764 scope.go:117] "RemoveContainer" containerID="dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c" Mar 20 15:05:47 crc kubenswrapper[4764]: E0320 15:05:47.820923 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c\": container with ID starting with dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c not found: ID does not exist" 
containerID="dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.820961 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c"} err="failed to get container status \"dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c\": rpc error: code = NotFound desc = could not find container \"dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c\": container with ID starting with dc4019f237f728461f0383b7887892ebd099b1c0e8507bdd79aeff1c3c4e982c not found: ID does not exist" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.820989 4764 scope.go:117] "RemoveContainer" containerID="1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6" Mar 20 15:05:47 crc kubenswrapper[4764]: E0320 15:05:47.821358 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6\": container with ID starting with 1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6 not found: ID does not exist" containerID="1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6" Mar 20 15:05:47 crc kubenswrapper[4764]: I0320 15:05:47.821410 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6"} err="failed to get container status \"1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6\": rpc error: code = NotFound desc = could not find container \"1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6\": container with ID starting with 1428ecf97d68c37614b59e769a4632c45dc87b55fbf4bf5fdab4e004f6f2d7f6 not found: ID does not exist" Mar 20 15:05:48 crc kubenswrapper[4764]: I0320 15:05:48.764610 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerStarted","Data":"15c7b7681f15c3567a87d8cecd5ef65cd183cfba45e3eac5925b056aafb94d22"} Mar 20 15:05:48 crc kubenswrapper[4764]: I0320 15:05:48.798866 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dhbfh" podStartSLOduration=2.3059685500000002 podStartE2EDuration="6.798840523s" podCreationTimestamp="2026-03-20 15:05:42 +0000 UTC" firstStartedPulling="2026-03-20 15:05:43.716950098 +0000 UTC m=+865.333139247" lastFinishedPulling="2026-03-20 15:05:48.209822051 +0000 UTC m=+869.826011220" observedRunningTime="2026-03-20 15:05:48.793317255 +0000 UTC m=+870.409506444" watchObservedRunningTime="2026-03-20 15:05:48.798840523 +0000 UTC m=+870.415029682" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.065633 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.133958 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" path="/var/lib/kubelet/pods/fff36760-c1d7-45a0-89cd-9a03fe1b113c/volumes" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.174500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle\") pod \"28cc9951-8e26-4c96-8dae-0151015c425b\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.174623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjdzd\" (UniqueName: \"kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd\") pod 
\"28cc9951-8e26-4c96-8dae-0151015c425b\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.174693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util\") pod \"28cc9951-8e26-4c96-8dae-0151015c425b\" (UID: \"28cc9951-8e26-4c96-8dae-0151015c425b\") " Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.175589 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle" (OuterVolumeSpecName: "bundle") pod "28cc9951-8e26-4c96-8dae-0151015c425b" (UID: "28cc9951-8e26-4c96-8dae-0151015c425b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.179838 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd" (OuterVolumeSpecName: "kube-api-access-tjdzd") pod "28cc9951-8e26-4c96-8dae-0151015c425b" (UID: "28cc9951-8e26-4c96-8dae-0151015c425b"). InnerVolumeSpecName "kube-api-access-tjdzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.198300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util" (OuterVolumeSpecName: "util") pod "28cc9951-8e26-4c96-8dae-0151015c425b" (UID: "28cc9951-8e26-4c96-8dae-0151015c425b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.276493 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjdzd\" (UniqueName: \"kubernetes.io/projected/28cc9951-8e26-4c96-8dae-0151015c425b-kube-api-access-tjdzd\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.276524 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.276533 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/28cc9951-8e26-4c96-8dae-0151015c425b-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.774294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerStarted","Data":"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a"} Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.777055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" event={"ID":"28cc9951-8e26-4c96-8dae-0151015c425b","Type":"ContainerDied","Data":"b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195"} Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.777091 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm" Mar 20 15:05:49 crc kubenswrapper[4764]: I0320 15:05:49.777098 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b28ec07380f88f631b0ff1356417577aef3444db0a5823a53fc8089dd20cb195" Mar 20 15:05:50 crc kubenswrapper[4764]: I0320 15:05:50.788235 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e677413-ea55-4c6b-87a7-535d636e3851" containerID="cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a" exitCode=0 Mar 20 15:05:50 crc kubenswrapper[4764]: I0320 15:05:50.788345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerDied","Data":"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a"} Mar 20 15:05:52 crc kubenswrapper[4764]: I0320 15:05:52.500828 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:52 crc kubenswrapper[4764]: I0320 15:05:52.501120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:52 crc kubenswrapper[4764]: I0320 15:05:52.536963 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:05:52 crc kubenswrapper[4764]: I0320 15:05:52.800927 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerStarted","Data":"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe"} Mar 20 15:05:52 crc kubenswrapper[4764]: I0320 15:05:52.820307 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rjwzb" 
podStartSLOduration=4.896516131 podStartE2EDuration="9.820291425s" podCreationTimestamp="2026-03-20 15:05:43 +0000 UTC" firstStartedPulling="2026-03-20 15:05:46.726121074 +0000 UTC m=+868.342310233" lastFinishedPulling="2026-03-20 15:05:51.649896388 +0000 UTC m=+873.266085527" observedRunningTime="2026-03-20 15:05:52.817650211 +0000 UTC m=+874.433839340" watchObservedRunningTime="2026-03-20 15:05:52.820291425 +0000 UTC m=+874.436480554" Mar 20 15:05:53 crc kubenswrapper[4764]: I0320 15:05:53.707966 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:53 crc kubenswrapper[4764]: I0320 15:05:53.708324 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:05:54 crc kubenswrapper[4764]: I0320 15:05:54.760232 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rjwzb" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="registry-server" probeResult="failure" output=< Mar 20 15:05:54 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:05:54 crc kubenswrapper[4764]: > Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.918894 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr"] Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919705 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="extract" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919719 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="extract" Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919730 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" 
containerName="registry-server" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919736 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="registry-server" Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919743 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="pull" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919748 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="pull" Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919758 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="extract-utilities" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919764 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="extract-utilities" Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919776 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="util" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919782 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="util" Mar 20 15:05:56 crc kubenswrapper[4764]: E0320 15:05:56.919791 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="extract-content" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919797 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="extract-content" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.919883 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff36760-c1d7-45a0-89cd-9a03fe1b113c" containerName="registry-server" Mar 20 15:05:56 crc 
kubenswrapper[4764]: I0320 15:05:56.919896 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cc9951-8e26-4c96-8dae-0151015c425b" containerName="extract" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.920367 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.923356 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.923496 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.923947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hwvvn" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.924401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.926933 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.940136 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr"] Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.998906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-apiservice-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 
15:05:56.999010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-webhook-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:56 crc kubenswrapper[4764]: I0320 15:05:56.999038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqzq\" (UniqueName: \"kubernetes.io/projected/2a6607dc-2637-4ccd-85ee-63db60070729-kube-api-access-dbqzq\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.100174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-webhook-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.100227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqzq\" (UniqueName: \"kubernetes.io/projected/2a6607dc-2637-4ccd-85ee-63db60070729-kube-api-access-dbqzq\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.100262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-apiservice-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.107075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-apiservice-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.112032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a6607dc-2637-4ccd-85ee-63db60070729-webhook-cert\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.123630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqzq\" (UniqueName: \"kubernetes.io/projected/2a6607dc-2637-4ccd-85ee-63db60070729-kube-api-access-dbqzq\") pod \"metallb-operator-controller-manager-f7bc86596-htwbr\" (UID: \"2a6607dc-2637-4ccd-85ee-63db60070729\") " pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.235705 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.239607 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9"] Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.240315 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.254746 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.254933 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.258618 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jvbrg" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.259490 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9"] Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.403297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tf9\" (UniqueName: \"kubernetes.io/projected/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-kube-api-access-f4tf9\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.403602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-webhook-cert\") pod 
\"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.403623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-apiservice-cert\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.504597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tf9\" (UniqueName: \"kubernetes.io/projected/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-kube-api-access-f4tf9\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.504678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-webhook-cert\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.504701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-apiservice-cert\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.511224 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-apiservice-cert\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.511972 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-webhook-cert\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.520993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tf9\" (UniqueName: \"kubernetes.io/projected/fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7-kube-api-access-f4tf9\") pod \"metallb-operator-webhook-server-5766dfb6c-m75b9\" (UID: \"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7\") " pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.591855 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.751732 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr"] Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.838745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" event={"ID":"2a6607dc-2637-4ccd-85ee-63db60070729","Type":"ContainerStarted","Data":"e9feafef427966458bd5a80c9ef4bb1e75f46903d5f00620d3fe0e5dc6255937"} Mar 20 15:05:57 crc kubenswrapper[4764]: I0320 15:05:57.855406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9"] Mar 20 15:05:57 crc kubenswrapper[4764]: W0320 15:05:57.859871 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf9844d_cd3b_400c_b7c2_ce3bc92f59e7.slice/crio-a685fec9ea981998291f7520b02c12299d69bded2980c50e2d5c9bec1eb19c7b WatchSource:0}: Error finding container a685fec9ea981998291f7520b02c12299d69bded2980c50e2d5c9bec1eb19c7b: Status 404 returned error can't find the container with id a685fec9ea981998291f7520b02c12299d69bded2980c50e2d5c9bec1eb19c7b Mar 20 15:05:58 crc kubenswrapper[4764]: I0320 15:05:58.846913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" event={"ID":"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7","Type":"ContainerStarted","Data":"a685fec9ea981998291f7520b02c12299d69bded2980c50e2d5c9bec1eb19c7b"} Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.128089 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566986-jgn2s"] Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.129055 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.131948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.132580 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.132752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.137757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566986-jgn2s"] Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.247900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj7w5\" (UniqueName: \"kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5\") pod \"auto-csr-approver-29566986-jgn2s\" (UID: \"73bc41bc-cd24-486e-bca8-1bb7a329b304\") " pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.348749 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj7w5\" (UniqueName: \"kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5\") pod \"auto-csr-approver-29566986-jgn2s\" (UID: \"73bc41bc-cd24-486e-bca8-1bb7a329b304\") " pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.372659 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj7w5\" (UniqueName: \"kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5\") pod \"auto-csr-approver-29566986-jgn2s\" (UID: \"73bc41bc-cd24-486e-bca8-1bb7a329b304\") " 
pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:00 crc kubenswrapper[4764]: I0320 15:06:00.453680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:01 crc kubenswrapper[4764]: I0320 15:06:01.892049 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566986-jgn2s"] Mar 20 15:06:02 crc kubenswrapper[4764]: I0320 15:06:02.571989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:06:02 crc kubenswrapper[4764]: I0320 15:06:02.881217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" event={"ID":"2a6607dc-2637-4ccd-85ee-63db60070729","Type":"ContainerStarted","Data":"d75ea1bfe1badd136845e8983cb27beda668b80dd981f8af4004c66770f4b727"} Mar 20 15:06:02 crc kubenswrapper[4764]: I0320 15:06:02.881392 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:06:02 crc kubenswrapper[4764]: I0320 15:06:02.882179 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" event={"ID":"73bc41bc-cd24-486e-bca8-1bb7a329b304","Type":"ContainerStarted","Data":"051b6be43f48ee658a0fb28c8bfab802d17fa9deb9f76df2a0f6503cf0464ac4"} Mar 20 15:06:02 crc kubenswrapper[4764]: I0320 15:06:02.904500 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" podStartSLOduration=2.4730915700000002 podStartE2EDuration="6.904482661s" podCreationTimestamp="2026-03-20 15:05:56 +0000 UTC" firstStartedPulling="2026-03-20 15:05:57.765623759 +0000 UTC m=+879.381812888" lastFinishedPulling="2026-03-20 15:06:02.19701485 +0000 UTC m=+883.813203979" 
observedRunningTime="2026-03-20 15:06:02.898634574 +0000 UTC m=+884.514823703" watchObservedRunningTime="2026-03-20 15:06:02.904482661 +0000 UTC m=+884.520671790" Mar 20 15:06:03 crc kubenswrapper[4764]: I0320 15:06:03.805890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:06:03 crc kubenswrapper[4764]: I0320 15:06:03.843911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.560825 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.561621 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dhbfh" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="registry-server" containerID="cri-o://15c7b7681f15c3567a87d8cecd5ef65cd183cfba45e3eac5925b056aafb94d22" gracePeriod=2 Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.907436 4764 generic.go:334] "Generic (PLEG): container finished" podID="66298e5d-985d-471d-a38c-736024fafb8e" containerID="15c7b7681f15c3567a87d8cecd5ef65cd183cfba45e3eac5925b056aafb94d22" exitCode=0 Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.907504 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerDied","Data":"15c7b7681f15c3567a87d8cecd5ef65cd183cfba45e3eac5925b056aafb94d22"} Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.909463 4764 generic.go:334] "Generic (PLEG): container finished" podID="73bc41bc-cd24-486e-bca8-1bb7a329b304" containerID="7a0aa33dbad07904ee592cf9e7e4bdb22f38f75749baff3de3ef3968975691e5" exitCode=0 Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.909555 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" event={"ID":"73bc41bc-cd24-486e-bca8-1bb7a329b304","Type":"ContainerDied","Data":"7a0aa33dbad07904ee592cf9e7e4bdb22f38f75749baff3de3ef3968975691e5"} Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.911517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" event={"ID":"fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7","Type":"ContainerStarted","Data":"d18a7cf4470e247ec98819e5ff7ca64aa29f694fee641396571e20bac0afadfe"} Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.911703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:06:06 crc kubenswrapper[4764]: I0320 15:06:06.963701 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" podStartSLOduration=2.450022734 podStartE2EDuration="9.963675954s" podCreationTimestamp="2026-03-20 15:05:57 +0000 UTC" firstStartedPulling="2026-03-20 15:05:57.863167928 +0000 UTC m=+879.479357057" lastFinishedPulling="2026-03-20 15:06:05.376821138 +0000 UTC m=+886.993010277" observedRunningTime="2026-03-20 15:06:06.953928011 +0000 UTC m=+888.570117150" watchObservedRunningTime="2026-03-20 15:06:06.963675954 +0000 UTC m=+888.579865123" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.471984 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.641203 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities\") pod \"66298e5d-985d-471d-a38c-736024fafb8e\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.641280 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnfrk\" (UniqueName: \"kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk\") pod \"66298e5d-985d-471d-a38c-736024fafb8e\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.641306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content\") pod \"66298e5d-985d-471d-a38c-736024fafb8e\" (UID: \"66298e5d-985d-471d-a38c-736024fafb8e\") " Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.642326 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities" (OuterVolumeSpecName: "utilities") pod "66298e5d-985d-471d-a38c-736024fafb8e" (UID: "66298e5d-985d-471d-a38c-736024fafb8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.649640 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk" (OuterVolumeSpecName: "kube-api-access-vnfrk") pod "66298e5d-985d-471d-a38c-736024fafb8e" (UID: "66298e5d-985d-471d-a38c-736024fafb8e"). InnerVolumeSpecName "kube-api-access-vnfrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.668280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66298e5d-985d-471d-a38c-736024fafb8e" (UID: "66298e5d-985d-471d-a38c-736024fafb8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.743012 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.743054 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnfrk\" (UniqueName: \"kubernetes.io/projected/66298e5d-985d-471d-a38c-736024fafb8e-kube-api-access-vnfrk\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.743070 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66298e5d-985d-471d-a38c-736024fafb8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.918047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dhbfh" event={"ID":"66298e5d-985d-471d-a38c-736024fafb8e","Type":"ContainerDied","Data":"cb1e6b7cdd7f24428c76335c166f2ef1c3add4e913adc21a17a70e22d7d8fc5f"} Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.918121 4764 scope.go:117] "RemoveContainer" containerID="15c7b7681f15c3567a87d8cecd5ef65cd183cfba45e3eac5925b056aafb94d22" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.918075 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dhbfh" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.946550 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.951367 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dhbfh"] Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.962396 4764 scope.go:117] "RemoveContainer" containerID="3c15fec01d8e1b3a2fd958e648e522cc84e35853035258257e4c41a92ac7f57e" Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.963705 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.963958 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rjwzb" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="registry-server" containerID="cri-o://d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe" gracePeriod=2 Mar 20 15:06:07 crc kubenswrapper[4764]: I0320 15:06:07.995986 4764 scope.go:117] "RemoveContainer" containerID="22e666f85bd0e4878ab556d206ce5119bd7aa6a25086675f5ec4b69ddc684f3b" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.300887 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.354779 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj7w5\" (UniqueName: \"kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5\") pod \"73bc41bc-cd24-486e-bca8-1bb7a329b304\" (UID: \"73bc41bc-cd24-486e-bca8-1bb7a329b304\") " Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.359374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5" (OuterVolumeSpecName: "kube-api-access-kj7w5") pod "73bc41bc-cd24-486e-bca8-1bb7a329b304" (UID: "73bc41bc-cd24-486e-bca8-1bb7a329b304"). InnerVolumeSpecName "kube-api-access-kj7w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.456115 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj7w5\" (UniqueName: \"kubernetes.io/projected/73bc41bc-cd24-486e-bca8-1bb7a329b304-kube-api-access-kj7w5\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.861768 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.925839 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" event={"ID":"73bc41bc-cd24-486e-bca8-1bb7a329b304","Type":"ContainerDied","Data":"051b6be43f48ee658a0fb28c8bfab802d17fa9deb9f76df2a0f6503cf0464ac4"} Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.925881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566986-jgn2s" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.925889 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051b6be43f48ee658a0fb28c8bfab802d17fa9deb9f76df2a0f6503cf0464ac4" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.928975 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e677413-ea55-4c6b-87a7-535d636e3851" containerID="d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe" exitCode=0 Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.929019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerDied","Data":"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe"} Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.929036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rjwzb" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.929070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rjwzb" event={"ID":"1e677413-ea55-4c6b-87a7-535d636e3851","Type":"ContainerDied","Data":"ed59d57f0908e7df7ffffa445ba62b052742a787c5378bdac5fb46f519702dc9"} Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.929091 4764 scope.go:117] "RemoveContainer" containerID="d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.943446 4764 scope.go:117] "RemoveContainer" containerID="cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.968902 4764 scope.go:117] "RemoveContainer" containerID="ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.995755 4764 
scope.go:117] "RemoveContainer" containerID="d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe" Mar 20 15:06:08 crc kubenswrapper[4764]: E0320 15:06:08.997886 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe\": container with ID starting with d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe not found: ID does not exist" containerID="d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.997989 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe"} err="failed to get container status \"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe\": rpc error: code = NotFound desc = could not find container \"d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe\": container with ID starting with d69978e58809f4ca7c6e475140f63b970bf42f375b5ed601715dd9b4eb089cfe not found: ID does not exist" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.998015 4764 scope.go:117] "RemoveContainer" containerID="cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a" Mar 20 15:06:08 crc kubenswrapper[4764]: E0320 15:06:08.998425 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a\": container with ID starting with cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a not found: ID does not exist" containerID="cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.998455 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a"} err="failed to get container status \"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a\": rpc error: code = NotFound desc = could not find container \"cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a\": container with ID starting with cdf4760fe48cd64ae729af8cf43b67462299241baa0f2553157213fc06137a9a not found: ID does not exist" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.998474 4764 scope.go:117] "RemoveContainer" containerID="ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1" Mar 20 15:06:08 crc kubenswrapper[4764]: E0320 15:06:08.999201 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1\": container with ID starting with ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1 not found: ID does not exist" containerID="ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1" Mar 20 15:06:08 crc kubenswrapper[4764]: I0320 15:06:08.999242 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1"} err="failed to get container status \"ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1\": rpc error: code = NotFound desc = could not find container \"ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1\": container with ID starting with ac971a6123af106838daa54795cb32f16e1eba91ea2e4a320941128ef2f948e1 not found: ID does not exist" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.061336 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2fps\" (UniqueName: \"kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps\") pod 
\"1e677413-ea55-4c6b-87a7-535d636e3851\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.062216 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities" (OuterVolumeSpecName: "utilities") pod "1e677413-ea55-4c6b-87a7-535d636e3851" (UID: "1e677413-ea55-4c6b-87a7-535d636e3851"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.062691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities\") pod \"1e677413-ea55-4c6b-87a7-535d636e3851\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.062751 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content\") pod \"1e677413-ea55-4c6b-87a7-535d636e3851\" (UID: \"1e677413-ea55-4c6b-87a7-535d636e3851\") " Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.062925 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.075611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps" (OuterVolumeSpecName: "kube-api-access-p2fps") pod "1e677413-ea55-4c6b-87a7-535d636e3851" (UID: "1e677413-ea55-4c6b-87a7-535d636e3851"). InnerVolumeSpecName "kube-api-access-p2fps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.134703 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66298e5d-985d-471d-a38c-736024fafb8e" path="/var/lib/kubelet/pods/66298e5d-985d-471d-a38c-736024fafb8e/volumes" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.164058 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2fps\" (UniqueName: \"kubernetes.io/projected/1e677413-ea55-4c6b-87a7-535d636e3851-kube-api-access-p2fps\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.194620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e677413-ea55-4c6b-87a7-535d636e3851" (UID: "1e677413-ea55-4c6b-87a7-535d636e3851"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.258322 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.265485 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rjwzb"] Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.266393 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e677413-ea55-4c6b-87a7-535d636e3851-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.359813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-sxrs8"] Mar 20 15:06:09 crc kubenswrapper[4764]: I0320 15:06:09.362974 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-sxrs8"] Mar 20 
15:06:11 crc kubenswrapper[4764]: I0320 15:06:11.139364 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" path="/var/lib/kubelet/pods/1e677413-ea55-4c6b-87a7-535d636e3851/volumes" Mar 20 15:06:11 crc kubenswrapper[4764]: I0320 15:06:11.140313 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a080aa-2f66-45e8-adb4-66cc7e2359e3" path="/var/lib/kubelet/pods/65a080aa-2f66-45e8-adb4-66cc7e2359e3/volumes" Mar 20 15:06:17 crc kubenswrapper[4764]: I0320 15:06:17.596655 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5766dfb6c-m75b9" Mar 20 15:06:34 crc kubenswrapper[4764]: I0320 15:06:34.186548 4764 scope.go:117] "RemoveContainer" containerID="4953a8cd06c5cbd019a63cd17a6ffe752dfa0afe70106c7866d6506464f30880" Mar 20 15:06:37 crc kubenswrapper[4764]: I0320 15:06:37.240036 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-f7bc86596-htwbr" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.090581 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf"] Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091097 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="extract-utilities" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091185 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="extract-utilities" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091262 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091339 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091456 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="extract-content" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091536 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="extract-content" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091607 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091669 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091756 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="extract-content" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091820 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="extract-content" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.091896 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="extract-utilities" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.091963 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="extract-utilities" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.092040 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bc41bc-cd24-486e-bca8-1bb7a329b304" containerName="oc" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.092118 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73bc41bc-cd24-486e-bca8-1bb7a329b304" containerName="oc" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.092309 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66298e5d-985d-471d-a38c-736024fafb8e" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.092420 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bc41bc-cd24-486e-bca8-1bb7a329b304" containerName="oc" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.092510 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e677413-ea55-4c6b-87a7-535d636e3851" containerName="registry-server" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.093117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.099159 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.099516 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r9q5r"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.102363 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.102663 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xckcf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.106916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.107001 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.108766 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158101 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797c71ba-f218-4ebc-9109-c1a8e36a1f75-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2j7\" (UniqueName: \"kubernetes.io/projected/797c71ba-f218-4ebc-9109-c1a8e36a1f75-kube-api-access-mn2j7\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-conf\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 
crc kubenswrapper[4764]: I0320 15:06:38.158255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3767b150-6238-4386-9155-4e198e0ee2d2-metrics-certs\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-metrics\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-sockets\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158397 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-reloader\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wchk\" (UniqueName: \"kubernetes.io/projected/3767b150-6238-4386-9155-4e198e0ee2d2-kube-api-access-9wchk\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.158505 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3767b150-6238-4386-9155-4e198e0ee2d2-frr-startup\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.226307 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-bbvxz"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.227107 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: W0320 15:06:38.228581 4764 reflector.go:561] object-"metallb-system"/"controller-certs-secret": failed to list *v1.Secret: secrets "controller-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.228685 4764 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-certs-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.245491 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vc5vs"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.246811 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.248744 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.249125 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.249139 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-lsj2z" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.249146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wchk\" (UniqueName: \"kubernetes.io/projected/3767b150-6238-4386-9155-4e198e0ee2d2-kube-api-access-9wchk\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259389 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3767b150-6238-4386-9155-4e198e0ee2d2-frr-startup\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797c71ba-f218-4ebc-9109-c1a8e36a1f75-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9tw\" (UniqueName: \"kubernetes.io/projected/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-kube-api-access-8g9tw\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2j7\" (UniqueName: \"kubernetes.io/projected/797c71ba-f218-4ebc-9109-c1a8e36a1f75-kube-api-access-mn2j7\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259827 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-conf\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/343e987b-163e-410c-a81d-83f3abb76064-metallb-excludel2\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.259989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3767b150-6238-4386-9155-4e198e0ee2d2-metrics-certs\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-metrics\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-sockets\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l2wb\" (UniqueName: 
\"kubernetes.io/projected/343e987b-163e-410c-a81d-83f3abb76064-kube-api-access-5l2wb\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-reloader\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260412 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-cert\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-reloader\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-sockets\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.260914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3767b150-6238-4386-9155-4e198e0ee2d2-frr-startup\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc 
kubenswrapper[4764]: I0320 15:06:38.260942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-metrics\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.261194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3767b150-6238-4386-9155-4e198e0ee2d2-frr-conf\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.270170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3767b150-6238-4386-9155-4e198e0ee2d2-metrics-certs\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.275373 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-bbvxz"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.281107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/797c71ba-f218-4ebc-9109-c1a8e36a1f75-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.297207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wchk\" (UniqueName: \"kubernetes.io/projected/3767b150-6238-4386-9155-4e198e0ee2d2-kube-api-access-9wchk\") pod \"frr-k8s-r9q5r\" (UID: \"3767b150-6238-4386-9155-4e198e0ee2d2\") " pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 
15:06:38.309096 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2j7\" (UniqueName: \"kubernetes.io/projected/797c71ba-f218-4ebc-9109-c1a8e36a1f75-kube-api-access-mn2j7\") pod \"frr-k8s-webhook-server-bcc4b6f68-zwrnf\" (UID: \"797c71ba-f218-4ebc-9109-c1a8e36a1f75\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l2wb\" (UniqueName: \"kubernetes.io/projected/343e987b-163e-410c-a81d-83f3abb76064-kube-api-access-5l2wb\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-cert\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361725 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9tw\" (UniqueName: \"kubernetes.io/projected/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-kube-api-access-8g9tw\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.361761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/343e987b-163e-410c-a81d-83f3abb76064-metallb-excludel2\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.361836 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.361846 4764 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.361911 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist podName:343e987b-163e-410c-a81d-83f3abb76064 nodeName:}" failed. No retries permitted until 2026-03-20 15:06:38.86189252 +0000 UTC m=+920.478081649 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist") pod "speaker-vc5vs" (UID: "343e987b-163e-410c-a81d-83f3abb76064") : secret "metallb-memberlist" not found Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.361969 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs podName:343e987b-163e-410c-a81d-83f3abb76064 nodeName:}" failed. No retries permitted until 2026-03-20 15:06:38.861951272 +0000 UTC m=+920.478140401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs") pod "speaker-vc5vs" (UID: "343e987b-163e-410c-a81d-83f3abb76064") : secret "speaker-certs-secret" not found Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.362404 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/343e987b-163e-410c-a81d-83f3abb76064-metallb-excludel2\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.363127 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.376837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-cert\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.381176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9tw\" (UniqueName: 
\"kubernetes.io/projected/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-kube-api-access-8g9tw\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.384599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l2wb\" (UniqueName: \"kubernetes.io/projected/343e987b-163e-410c-a81d-83f3abb76064-kube-api-access-5l2wb\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.419096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.432567 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.813051 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf"] Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.869084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.869213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.869241 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: 
secret "metallb-memberlist" not found Mar 20 15:06:38 crc kubenswrapper[4764]: E0320 15:06:38.869326 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist podName:343e987b-163e-410c-a81d-83f3abb76064 nodeName:}" failed. No retries permitted until 2026-03-20 15:06:39.869305012 +0000 UTC m=+921.485494151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist") pod "speaker-vc5vs" (UID: "343e987b-163e-410c-a81d-83f3abb76064") : secret "metallb-memberlist" not found Mar 20 15:06:38 crc kubenswrapper[4764]: I0320 15:06:38.881120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-metrics-certs\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.147191 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" event={"ID":"797c71ba-f218-4ebc-9109-c1a8e36a1f75","Type":"ContainerStarted","Data":"5c7128b42b1ac681459d3ba1d17cf633a623e45c980ad054ed4c04782e903b25"} Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.150242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"23ac6d4289cac2e95137f2897914dafdb1b94a1684311105cc2ea4978cc80ce8"} Mar 20 15:06:39 crc kubenswrapper[4764]: E0320 15:06:39.362255 4764 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: failed to sync secret cache: timed out waiting for the condition Mar 20 15:06:39 crc kubenswrapper[4764]: E0320 15:06:39.362411 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs podName:bbdf4541-5578-4f99-b7fb-bc3be4cd939a nodeName:}" failed. No retries permitted until 2026-03-20 15:06:39.86236047 +0000 UTC m=+921.478549639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs") pod "controller-7bb4cc7c98-bbvxz" (UID: "bbdf4541-5578-4f99-b7fb-bc3be4cd939a") : failed to sync secret cache: timed out waiting for the condition Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.704767 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.889255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.889757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:39 crc kubenswrapper[4764]: E0320 15:06:39.889479 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 15:06:39 crc kubenswrapper[4764]: E0320 15:06:39.889896 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist podName:343e987b-163e-410c-a81d-83f3abb76064 nodeName:}" failed. 
No retries permitted until 2026-03-20 15:06:41.88987304 +0000 UTC m=+923.506062249 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist") pod "speaker-vc5vs" (UID: "343e987b-163e-410c-a81d-83f3abb76064") : secret "metallb-memberlist" not found Mar 20 15:06:39 crc kubenswrapper[4764]: I0320 15:06:39.905197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bbdf4541-5578-4f99-b7fb-bc3be4cd939a-metrics-certs\") pod \"controller-7bb4cc7c98-bbvxz\" (UID: \"bbdf4541-5578-4f99-b7fb-bc3be4cd939a\") " pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:40 crc kubenswrapper[4764]: I0320 15:06:40.040922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:40 crc kubenswrapper[4764]: I0320 15:06:40.385647 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-bbvxz"] Mar 20 15:06:40 crc kubenswrapper[4764]: W0320 15:06:40.414233 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbdf4541_5578_4f99_b7fb_bc3be4cd939a.slice/crio-b6f8e7fb6f9df3eadb1a236796458e2653c86e0e09ec7fe7a51b28baed03412f WatchSource:0}: Error finding container b6f8e7fb6f9df3eadb1a236796458e2653c86e0e09ec7fe7a51b28baed03412f: Status 404 returned error can't find the container with id b6f8e7fb6f9df3eadb1a236796458e2653c86e0e09ec7fe7a51b28baed03412f Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.170562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bbvxz" event={"ID":"bbdf4541-5578-4f99-b7fb-bc3be4cd939a","Type":"ContainerStarted","Data":"6e7dff309a958149a0838c0782f3ee9092b9b162dd60e999d1bb66afd4e08db7"} Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 
15:06:41.170819 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.170830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bbvxz" event={"ID":"bbdf4541-5578-4f99-b7fb-bc3be4cd939a","Type":"ContainerStarted","Data":"ff137eee469f1c34624532ade4cb0961b81fe133794d2fdda35cae7c7de9d725"} Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.170848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bbvxz" event={"ID":"bbdf4541-5578-4f99-b7fb-bc3be4cd939a","Type":"ContainerStarted","Data":"b6f8e7fb6f9df3eadb1a236796458e2653c86e0e09ec7fe7a51b28baed03412f"} Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.190030 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-bbvxz" podStartSLOduration=3.19001181 podStartE2EDuration="3.19001181s" podCreationTimestamp="2026-03-20 15:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:06:41.186470753 +0000 UTC m=+922.802659892" watchObservedRunningTime="2026-03-20 15:06:41.19001181 +0000 UTC m=+922.806200939" Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.929801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: \"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:41 crc kubenswrapper[4764]: I0320 15:06:41.934836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/343e987b-163e-410c-a81d-83f3abb76064-memberlist\") pod \"speaker-vc5vs\" (UID: 
\"343e987b-163e-410c-a81d-83f3abb76064\") " pod="metallb-system/speaker-vc5vs" Mar 20 15:06:42 crc kubenswrapper[4764]: I0320 15:06:42.165689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vc5vs" Mar 20 15:06:43 crc kubenswrapper[4764]: I0320 15:06:43.188303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc5vs" event={"ID":"343e987b-163e-410c-a81d-83f3abb76064","Type":"ContainerStarted","Data":"55cbc5a2b7519596fe7766907182f91325d34006723fa792c2bb46529a84e03d"} Mar 20 15:06:43 crc kubenswrapper[4764]: I0320 15:06:43.188600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc5vs" event={"ID":"343e987b-163e-410c-a81d-83f3abb76064","Type":"ContainerStarted","Data":"fa6d79e0eb7ca024c8a24a6e6d6ee9cc3fd3721cf75d712452ea6fe2bb6964fe"} Mar 20 15:06:43 crc kubenswrapper[4764]: I0320 15:06:43.188609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vc5vs" event={"ID":"343e987b-163e-410c-a81d-83f3abb76064","Type":"ContainerStarted","Data":"851a1d37bcab991e03e81d93c686662135bf86b009531f954124066d85b5ae57"} Mar 20 15:06:43 crc kubenswrapper[4764]: I0320 15:06:43.189013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vc5vs" Mar 20 15:06:43 crc kubenswrapper[4764]: I0320 15:06:43.209179 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vc5vs" podStartSLOduration=5.20916122 podStartE2EDuration="5.20916122s" podCreationTimestamp="2026-03-20 15:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:06:43.207904701 +0000 UTC m=+924.824093830" watchObservedRunningTime="2026-03-20 15:06:43.20916122 +0000 UTC m=+924.825350349" Mar 20 15:06:46 crc kubenswrapper[4764]: I0320 15:06:46.221279 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" event={"ID":"797c71ba-f218-4ebc-9109-c1a8e36a1f75","Type":"ContainerStarted","Data":"4fc2d360591ee1862a72c1373357531237f808d1a4e3c6cf77d6f79f1e6a895f"} Mar 20 15:06:46 crc kubenswrapper[4764]: I0320 15:06:46.221909 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:46 crc kubenswrapper[4764]: I0320 15:06:46.237364 4764 generic.go:334] "Generic (PLEG): container finished" podID="3767b150-6238-4386-9155-4e198e0ee2d2" containerID="1d7c93f62daef880d5d89c0a9fb5acec1da286a243b43896ebfdd8acfc4a7b37" exitCode=0 Mar 20 15:06:46 crc kubenswrapper[4764]: I0320 15:06:46.237480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerDied","Data":"1d7c93f62daef880d5d89c0a9fb5acec1da286a243b43896ebfdd8acfc4a7b37"} Mar 20 15:06:46 crc kubenswrapper[4764]: I0320 15:06:46.243704 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" podStartSLOduration=1.122789787 podStartE2EDuration="8.24368614s" podCreationTimestamp="2026-03-20 15:06:38 +0000 UTC" firstStartedPulling="2026-03-20 15:06:38.824276051 +0000 UTC m=+920.440465190" lastFinishedPulling="2026-03-20 15:06:45.945172374 +0000 UTC m=+927.561361543" observedRunningTime="2026-03-20 15:06:46.240553555 +0000 UTC m=+927.856742694" watchObservedRunningTime="2026-03-20 15:06:46.24368614 +0000 UTC m=+927.859875269" Mar 20 15:06:47 crc kubenswrapper[4764]: I0320 15:06:47.246996 4764 generic.go:334] "Generic (PLEG): container finished" podID="3767b150-6238-4386-9155-4e198e0ee2d2" containerID="95205f71e2aa0861b89263676c156cd1a201c512b4e96a65c51c7dc63a0a1f74" exitCode=0 Mar 20 15:06:47 crc kubenswrapper[4764]: I0320 15:06:47.247098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerDied","Data":"95205f71e2aa0861b89263676c156cd1a201c512b4e96a65c51c7dc63a0a1f74"} Mar 20 15:06:48 crc kubenswrapper[4764]: I0320 15:06:48.261250 4764 generic.go:334] "Generic (PLEG): container finished" podID="3767b150-6238-4386-9155-4e198e0ee2d2" containerID="cacab6b507311ef61af06437f0473ba0d61d0b7abc30d7fb46dcedcdaa05a60b" exitCode=0 Mar 20 15:06:48 crc kubenswrapper[4764]: I0320 15:06:48.261500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerDied","Data":"cacab6b507311ef61af06437f0473ba0d61d0b7abc30d7fb46dcedcdaa05a60b"} Mar 20 15:06:49 crc kubenswrapper[4764]: I0320 15:06:49.274756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"175f01f34b6759f37a0caff449bcf7a02b169668d1ecea7d5b3fe67835f10982"} Mar 20 15:06:49 crc kubenswrapper[4764]: I0320 15:06:49.275050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"e668cc21305ebf94c5d8825888a172e62a15b44771206ac6a07dfa087b545268"} Mar 20 15:06:49 crc kubenswrapper[4764]: I0320 15:06:49.275062 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"5de13d6341806592ab8ec06d8c8255f584d2ebd94ba39ef3273cc26342c23ec8"} Mar 20 15:06:49 crc kubenswrapper[4764]: I0320 15:06:49.275071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"7392740ee45cf05f6f0af4a1dd32ab0e641bef268d36e7aaf36ea73927769fe3"} Mar 20 15:06:50 crc 
kubenswrapper[4764]: I0320 15:06:50.049178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-bbvxz" Mar 20 15:06:50 crc kubenswrapper[4764]: I0320 15:06:50.288793 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"f8741bc3f307dac1921560b0ebbed6dc975620357fd9f0813cd65665f1f0a94a"} Mar 20 15:06:50 crc kubenswrapper[4764]: I0320 15:06:50.288842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r9q5r" event={"ID":"3767b150-6238-4386-9155-4e198e0ee2d2","Type":"ContainerStarted","Data":"bdeb5309890a2c43323f8f7f19916a729a2b2f592b9098adeacd89d4a0075760"} Mar 20 15:06:50 crc kubenswrapper[4764]: I0320 15:06:50.289984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:50 crc kubenswrapper[4764]: I0320 15:06:50.315796 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r9q5r" podStartSLOduration=4.983202169 podStartE2EDuration="12.315777262s" podCreationTimestamp="2026-03-20 15:06:38 +0000 UTC" firstStartedPulling="2026-03-20 15:06:38.565553889 +0000 UTC m=+920.181743018" lastFinishedPulling="2026-03-20 15:06:45.898128952 +0000 UTC m=+927.514318111" observedRunningTime="2026-03-20 15:06:50.310180783 +0000 UTC m=+931.926369932" watchObservedRunningTime="2026-03-20 15:06:50.315777262 +0000 UTC m=+931.931966401" Mar 20 15:06:52 crc kubenswrapper[4764]: I0320 15:06:52.171365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vc5vs" Mar 20 15:06:53 crc kubenswrapper[4764]: I0320 15:06:53.433600 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:53 crc kubenswrapper[4764]: I0320 15:06:53.473881 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.533662 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.534768 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.537735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.538115 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.539696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2pbbt" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.629923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.632554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgv5l\" (UniqueName: \"kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l\") pod \"openstack-operator-index-87s98\" (UID: \"68d048b9-180a-435c-ac78-a8ce29ab79d3\") " pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.733404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgv5l\" (UniqueName: \"kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l\") pod \"openstack-operator-index-87s98\" (UID: \"68d048b9-180a-435c-ac78-a8ce29ab79d3\") " 
pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.757288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgv5l\" (UniqueName: \"kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l\") pod \"openstack-operator-index-87s98\" (UID: \"68d048b9-180a-435c-ac78-a8ce29ab79d3\") " pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:06:55 crc kubenswrapper[4764]: I0320 15:06:55.906855 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:06:56 crc kubenswrapper[4764]: I0320 15:06:56.115115 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:06:56 crc kubenswrapper[4764]: I0320 15:06:56.342064 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87s98" event={"ID":"68d048b9-180a-435c-ac78-a8ce29ab79d3","Type":"ContainerStarted","Data":"da8847995a209e485cd920b0652b4cdf0758bd6b6445df9368e62550aa55d07e"} Mar 20 15:06:58 crc kubenswrapper[4764]: I0320 15:06:58.426004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-zwrnf" Mar 20 15:06:58 crc kubenswrapper[4764]: I0320 15:06:58.436668 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r9q5r" Mar 20 15:06:58 crc kubenswrapper[4764]: I0320 15:06:58.900463 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.507916 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-slfdz"] Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.508781 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.517398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-slfdz"] Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.589213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwhhm\" (UniqueName: \"kubernetes.io/projected/168e96d7-6450-4a0f-95ba-d9d42d7ab187-kube-api-access-jwhhm\") pod \"openstack-operator-index-slfdz\" (UID: \"168e96d7-6450-4a0f-95ba-d9d42d7ab187\") " pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.690504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwhhm\" (UniqueName: \"kubernetes.io/projected/168e96d7-6450-4a0f-95ba-d9d42d7ab187-kube-api-access-jwhhm\") pod \"openstack-operator-index-slfdz\" (UID: \"168e96d7-6450-4a0f-95ba-d9d42d7ab187\") " pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.709611 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwhhm\" (UniqueName: \"kubernetes.io/projected/168e96d7-6450-4a0f-95ba-d9d42d7ab187-kube-api-access-jwhhm\") pod \"openstack-operator-index-slfdz\" (UID: \"168e96d7-6450-4a0f-95ba-d9d42d7ab187\") " pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:06:59 crc kubenswrapper[4764]: I0320 15:06:59.853211 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:07:03 crc kubenswrapper[4764]: I0320 15:07:03.244801 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-slfdz"] Mar 20 15:07:04 crc kubenswrapper[4764]: I0320 15:07:04.405973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-slfdz" event={"ID":"168e96d7-6450-4a0f-95ba-d9d42d7ab187","Type":"ContainerStarted","Data":"16369d59a762e68227e9d2203ed770e17fd1b6ca662c24e2a246ca02f10bf585"} Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.413753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-slfdz" event={"ID":"168e96d7-6450-4a0f-95ba-d9d42d7ab187","Type":"ContainerStarted","Data":"72d3d5f3973c81e8c8fada0c6e3151b2d696559dc09b413ebbd47e9e712359de"} Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.416457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87s98" event={"ID":"68d048b9-180a-435c-ac78-a8ce29ab79d3","Type":"ContainerStarted","Data":"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c"} Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.416508 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-87s98" podUID="68d048b9-180a-435c-ac78-a8ce29ab79d3" containerName="registry-server" containerID="cri-o://0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c" gracePeriod=2 Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.431970 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-slfdz" podStartSLOduration=5.466712101 podStartE2EDuration="6.431948275s" podCreationTimestamp="2026-03-20 15:06:59 +0000 UTC" firstStartedPulling="2026-03-20 15:07:03.473282954 +0000 UTC 
m=+945.089472083" lastFinishedPulling="2026-03-20 15:07:04.438519098 +0000 UTC m=+946.054708257" observedRunningTime="2026-03-20 15:07:05.428037536 +0000 UTC m=+947.044226705" watchObservedRunningTime="2026-03-20 15:07:05.431948275 +0000 UTC m=+947.048137404" Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.446124 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-87s98" podStartSLOduration=2.138393527 podStartE2EDuration="10.446108193s" podCreationTimestamp="2026-03-20 15:06:55 +0000 UTC" firstStartedPulling="2026-03-20 15:06:56.129422 +0000 UTC m=+937.745611149" lastFinishedPulling="2026-03-20 15:07:04.437136656 +0000 UTC m=+946.053325815" observedRunningTime="2026-03-20 15:07:05.443348459 +0000 UTC m=+947.059537598" watchObservedRunningTime="2026-03-20 15:07:05.446108193 +0000 UTC m=+947.062297322" Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.822578 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.903899 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgv5l\" (UniqueName: \"kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l\") pod \"68d048b9-180a-435c-ac78-a8ce29ab79d3\" (UID: \"68d048b9-180a-435c-ac78-a8ce29ab79d3\") " Mar 20 15:07:05 crc kubenswrapper[4764]: I0320 15:07:05.909279 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l" (OuterVolumeSpecName: "kube-api-access-fgv5l") pod "68d048b9-180a-435c-ac78-a8ce29ab79d3" (UID: "68d048b9-180a-435c-ac78-a8ce29ab79d3"). InnerVolumeSpecName "kube-api-access-fgv5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.005686 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgv5l\" (UniqueName: \"kubernetes.io/projected/68d048b9-180a-435c-ac78-a8ce29ab79d3-kube-api-access-fgv5l\") on node \"crc\" DevicePath \"\"" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.425550 4764 generic.go:334] "Generic (PLEG): container finished" podID="68d048b9-180a-435c-ac78-a8ce29ab79d3" containerID="0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c" exitCode=0 Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.425662 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-87s98" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.425651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87s98" event={"ID":"68d048b9-180a-435c-ac78-a8ce29ab79d3","Type":"ContainerDied","Data":"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c"} Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.425982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-87s98" event={"ID":"68d048b9-180a-435c-ac78-a8ce29ab79d3","Type":"ContainerDied","Data":"da8847995a209e485cd920b0652b4cdf0758bd6b6445df9368e62550aa55d07e"} Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.426012 4764 scope.go:117] "RemoveContainer" containerID="0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.450339 4764 scope.go:117] "RemoveContainer" containerID="0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c" Mar 20 15:07:06 crc kubenswrapper[4764]: E0320 15:07:06.450945 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c\": container with ID starting with 0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c not found: ID does not exist" containerID="0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.450998 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c"} err="failed to get container status \"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c\": rpc error: code = NotFound desc = could not find container \"0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c\": container with ID starting with 0fa4cc14b810de80f9847c520fe7330ae7cd37c8ba493bdaf69abf7c9fd08a0c not found: ID does not exist" Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.473758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:07:06 crc kubenswrapper[4764]: I0320 15:07:06.481661 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-87s98"] Mar 20 15:07:07 crc kubenswrapper[4764]: I0320 15:07:07.136490 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d048b9-180a-435c-ac78-a8ce29ab79d3" path="/var/lib/kubelet/pods/68d048b9-180a-435c-ac78-a8ce29ab79d3/volumes" Mar 20 15:07:08 crc kubenswrapper[4764]: I0320 15:07:08.443849 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:07:08 crc kubenswrapper[4764]: I0320 15:07:08.444556 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:07:09 crc kubenswrapper[4764]: I0320 15:07:09.853970 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:07:09 crc kubenswrapper[4764]: I0320 15:07:09.854627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:07:09 crc kubenswrapper[4764]: I0320 15:07:09.892441 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:07:10 crc kubenswrapper[4764]: I0320 15:07:10.520044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-slfdz" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.789976 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l"] Mar 20 15:07:16 crc kubenswrapper[4764]: E0320 15:07:16.790603 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d048b9-180a-435c-ac78-a8ce29ab79d3" containerName="registry-server" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.790618 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d048b9-180a-435c-ac78-a8ce29ab79d3" containerName="registry-server" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.790762 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d048b9-180a-435c-ac78-a8ce29ab79d3" containerName="registry-server" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.791743 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.794072 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djt6g" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.804123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l"] Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.868465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.868707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.868947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2z7\" (UniqueName: \"kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 
15:07:16.970356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2z7\" (UniqueName: \"kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.970435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.970494 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.971066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.971066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:16 crc kubenswrapper[4764]: I0320 15:07:16.990912 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2z7\" (UniqueName: \"kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7\") pod \"a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:17 crc kubenswrapper[4764]: I0320 15:07:17.115022 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:17 crc kubenswrapper[4764]: I0320 15:07:17.330344 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l"] Mar 20 15:07:17 crc kubenswrapper[4764]: I0320 15:07:17.550549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerStarted","Data":"7249e455c1714253c9617089fbe09af7d3a86d17b4800c8fa7fcf3b68bba6b66"} Mar 20 15:07:17 crc kubenswrapper[4764]: I0320 15:07:17.550610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerStarted","Data":"6dcd732b66a04fbcd30a7dc736a4bf8a3991844933a03476952a2dc5720d943e"} Mar 20 15:07:18 crc kubenswrapper[4764]: I0320 15:07:18.560956 4764 
generic.go:334] "Generic (PLEG): container finished" podID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerID="7249e455c1714253c9617089fbe09af7d3a86d17b4800c8fa7fcf3b68bba6b66" exitCode=0 Mar 20 15:07:18 crc kubenswrapper[4764]: I0320 15:07:18.561021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerDied","Data":"7249e455c1714253c9617089fbe09af7d3a86d17b4800c8fa7fcf3b68bba6b66"} Mar 20 15:07:20 crc kubenswrapper[4764]: I0320 15:07:20.579250 4764 generic.go:334] "Generic (PLEG): container finished" podID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerID="e8386feecf4f40965d4eacea6476ca16d09a4b799e6f0d88a08630698c62b9ba" exitCode=0 Mar 20 15:07:20 crc kubenswrapper[4764]: I0320 15:07:20.579314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerDied","Data":"e8386feecf4f40965d4eacea6476ca16d09a4b799e6f0d88a08630698c62b9ba"} Mar 20 15:07:21 crc kubenswrapper[4764]: I0320 15:07:21.590927 4764 generic.go:334] "Generic (PLEG): container finished" podID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerID="3a41b7b5f23b0c5595904419c0de1927372d595e43e8a6c0ead3eb392ed4796d" exitCode=0 Mar 20 15:07:21 crc kubenswrapper[4764]: I0320 15:07:21.590985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerDied","Data":"3a41b7b5f23b0c5595904419c0de1927372d595e43e8a6c0ead3eb392ed4796d"} Mar 20 15:07:22 crc kubenswrapper[4764]: I0320 15:07:22.944842 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.057817 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc2z7\" (UniqueName: \"kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7\") pod \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.057883 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util\") pod \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.057921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle\") pod \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\" (UID: \"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe\") " Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.058807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle" (OuterVolumeSpecName: "bundle") pod "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" (UID: "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.064984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7" (OuterVolumeSpecName: "kube-api-access-fc2z7") pod "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" (UID: "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe"). InnerVolumeSpecName "kube-api-access-fc2z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.159169 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc2z7\" (UniqueName: \"kubernetes.io/projected/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-kube-api-access-fc2z7\") on node \"crc\" DevicePath \"\"" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.159218 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.309457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util" (OuterVolumeSpecName: "util") pod "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" (UID: "2c0a59ef-c6d3-40b8-8134-4223ae9d69fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.362293 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c0a59ef-c6d3-40b8-8134-4223ae9d69fe-util\") on node \"crc\" DevicePath \"\"" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.612701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" event={"ID":"2c0a59ef-c6d3-40b8-8134-4223ae9d69fe","Type":"ContainerDied","Data":"6dcd732b66a04fbcd30a7dc736a4bf8a3991844933a03476952a2dc5720d943e"} Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.612747 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dcd732b66a04fbcd30a7dc736a4bf8a3991844933a03476952a2dc5720d943e" Mar 20 15:07:23 crc kubenswrapper[4764]: I0320 15:07:23.612784 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.362268 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7"] Mar 20 15:07:29 crc kubenswrapper[4764]: E0320 15:07:29.362854 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="extract" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.362869 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="extract" Mar 20 15:07:29 crc kubenswrapper[4764]: E0320 15:07:29.362892 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="pull" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.362901 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="pull" Mar 20 15:07:29 crc kubenswrapper[4764]: E0320 15:07:29.362917 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="util" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.362925 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="util" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.363070 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c0a59ef-c6d3-40b8-8134-4223ae9d69fe" containerName="extract" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.363623 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.366312 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gmp97" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.384365 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7"] Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.455166 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz65x\" (UniqueName: \"kubernetes.io/projected/53c82166-2b9f-4340-8eb1-c95569faca61-kube-api-access-tz65x\") pod \"openstack-operator-controller-init-74bc6f6bf8-hk7p7\" (UID: \"53c82166-2b9f-4340-8eb1-c95569faca61\") " pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.556602 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz65x\" (UniqueName: \"kubernetes.io/projected/53c82166-2b9f-4340-8eb1-c95569faca61-kube-api-access-tz65x\") pod \"openstack-operator-controller-init-74bc6f6bf8-hk7p7\" (UID: \"53c82166-2b9f-4340-8eb1-c95569faca61\") " pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.584372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz65x\" (UniqueName: \"kubernetes.io/projected/53c82166-2b9f-4340-8eb1-c95569faca61-kube-api-access-tz65x\") pod \"openstack-operator-controller-init-74bc6f6bf8-hk7p7\" (UID: \"53c82166-2b9f-4340-8eb1-c95569faca61\") " pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.684649 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:29 crc kubenswrapper[4764]: I0320 15:07:29.989105 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7"] Mar 20 15:07:30 crc kubenswrapper[4764]: I0320 15:07:30.669740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" event={"ID":"53c82166-2b9f-4340-8eb1-c95569faca61","Type":"ContainerStarted","Data":"e80158529567aec4f62d5d24171264f5a2bea748baa7b0dd7cb38d1327ccbe85"} Mar 20 15:07:37 crc kubenswrapper[4764]: I0320 15:07:37.718865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" event={"ID":"53c82166-2b9f-4340-8eb1-c95569faca61","Type":"ContainerStarted","Data":"462e2b4263adda0e137d7e3549e702260d23c44099b7d2fc4d5633a4ee447204"} Mar 20 15:07:37 crc kubenswrapper[4764]: I0320 15:07:37.719594 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:07:37 crc kubenswrapper[4764]: I0320 15:07:37.778357 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" podStartSLOduration=1.9675427079999999 podStartE2EDuration="8.778322096s" podCreationTimestamp="2026-03-20 15:07:29 +0000 UTC" firstStartedPulling="2026-03-20 15:07:29.998894575 +0000 UTC m=+971.615083704" lastFinishedPulling="2026-03-20 15:07:36.809673953 +0000 UTC m=+978.425863092" observedRunningTime="2026-03-20 15:07:37.774872611 +0000 UTC m=+979.391061780" watchObservedRunningTime="2026-03-20 15:07:37.778322096 +0000 UTC m=+979.394511265" Mar 20 15:07:38 crc kubenswrapper[4764]: I0320 15:07:38.444299 4764 patch_prober.go:28] interesting 
pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:07:38 crc kubenswrapper[4764]: I0320 15:07:38.444434 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:07:49 crc kubenswrapper[4764]: I0320 15:07:49.688028 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-74bc6f6bf8-hk7p7" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.146935 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566988-bxvzm"] Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.148759 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.151373 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.156271 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.158801 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566988-bxvzm"] Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.159516 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.190767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6\") pod \"auto-csr-approver-29566988-bxvzm\" (UID: \"3c7ef967-1634-40b5-a3eb-972372a02741\") " pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.293105 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6\") pod \"auto-csr-approver-29566988-bxvzm\" (UID: \"3c7ef967-1634-40b5-a3eb-972372a02741\") " pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.318993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6\") pod \"auto-csr-approver-29566988-bxvzm\" (UID: \"3c7ef967-1634-40b5-a3eb-972372a02741\") " 
pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.469185 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:00 crc kubenswrapper[4764]: I0320 15:08:00.688766 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566988-bxvzm"] Mar 20 15:08:01 crc kubenswrapper[4764]: I0320 15:08:01.512355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" event={"ID":"3c7ef967-1634-40b5-a3eb-972372a02741","Type":"ContainerStarted","Data":"d2bdb9a40b0462eb0934e7ed909d76cc8a2396960bef64a1e24ab61bc6fb948b"} Mar 20 15:08:07 crc kubenswrapper[4764]: I0320 15:08:07.548754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" event={"ID":"3c7ef967-1634-40b5-a3eb-972372a02741","Type":"ContainerStarted","Data":"b9cc45be1dd166b14b764a39c4a00ef0c78a8b43b1476e3e81b4cf854d2b0758"} Mar 20 15:08:07 crc kubenswrapper[4764]: I0320 15:08:07.569154 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" podStartSLOduration=1.611760668 podStartE2EDuration="7.569122715s" podCreationTimestamp="2026-03-20 15:08:00 +0000 UTC" firstStartedPulling="2026-03-20 15:08:00.698783656 +0000 UTC m=+1002.314972795" lastFinishedPulling="2026-03-20 15:08:06.656145713 +0000 UTC m=+1008.272334842" observedRunningTime="2026-03-20 15:08:07.565660359 +0000 UTC m=+1009.181849488" watchObservedRunningTime="2026-03-20 15:08:07.569122715 +0000 UTC m=+1009.185311834" Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.443300 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.443872 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.443952 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.445038 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.445166 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34" gracePeriod=600 Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.559144 4764 generic.go:334] "Generic (PLEG): container finished" podID="3c7ef967-1634-40b5-a3eb-972372a02741" containerID="b9cc45be1dd166b14b764a39c4a00ef0c78a8b43b1476e3e81b4cf854d2b0758" exitCode=0 Mar 20 15:08:08 crc kubenswrapper[4764]: I0320 15:08:08.559185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" 
event={"ID":"3c7ef967-1634-40b5-a3eb-972372a02741","Type":"ContainerDied","Data":"b9cc45be1dd166b14b764a39c4a00ef0c78a8b43b1476e3e81b4cf854d2b0758"} Mar 20 15:08:09 crc kubenswrapper[4764]: I0320 15:08:09.572718 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34" exitCode=0 Mar 20 15:08:09 crc kubenswrapper[4764]: I0320 15:08:09.572819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34"} Mar 20 15:08:09 crc kubenswrapper[4764]: I0320 15:08:09.573450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11"} Mar 20 15:08:09 crc kubenswrapper[4764]: I0320 15:08:09.573530 4764 scope.go:117] "RemoveContainer" containerID="d0a0145de00f1c32456f8ccedac5f6e372476de1dec21fbd6f506f2ab08b9e04" Mar 20 15:08:09 crc kubenswrapper[4764]: I0320 15:08:09.904860 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.080806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6\") pod \"3c7ef967-1634-40b5-a3eb-972372a02741\" (UID: \"3c7ef967-1634-40b5-a3eb-972372a02741\") " Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.087760 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6" (OuterVolumeSpecName: "kube-api-access-72fs6") pod "3c7ef967-1634-40b5-a3eb-972372a02741" (UID: "3c7ef967-1634-40b5-a3eb-972372a02741"). InnerVolumeSpecName "kube-api-access-72fs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.182583 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72fs6\" (UniqueName: \"kubernetes.io/projected/3c7ef967-1634-40b5-a3eb-972372a02741-kube-api-access-72fs6\") on node \"crc\" DevicePath \"\"" Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.592309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" event={"ID":"3c7ef967-1634-40b5-a3eb-972372a02741","Type":"ContainerDied","Data":"d2bdb9a40b0462eb0934e7ed909d76cc8a2396960bef64a1e24ab61bc6fb948b"} Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.592621 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2bdb9a40b0462eb0934e7ed909d76cc8a2396960bef64a1e24ab61bc6fb948b" Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.592346 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566988-bxvzm" Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.711811 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566982-z52lz"] Mar 20 15:08:10 crc kubenswrapper[4764]: I0320 15:08:10.716101 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566982-z52lz"] Mar 20 15:08:11 crc kubenswrapper[4764]: I0320 15:08:11.134591 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93cfbb2-5a0d-4736-88a1-5658c1030a4b" path="/var/lib/kubelet/pods/f93cfbb2-5a0d-4736-88a1-5658c1030a4b/volumes" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.811823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm"] Mar 20 15:08:19 crc kubenswrapper[4764]: E0320 15:08:19.812564 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c7ef967-1634-40b5-a3eb-972372a02741" containerName="oc" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.812578 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c7ef967-1634-40b5-a3eb-972372a02741" containerName="oc" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.812712 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c7ef967-1634-40b5-a3eb-972372a02741" containerName="oc" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.813193 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.816983 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-kc6jc" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.817336 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.818072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.823594 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.823885 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-s9vkw" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.834112 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.846132 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.846817 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.850690 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nxdkt" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.859144 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.860659 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.867243 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-d9bwf" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.871994 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.899760 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.904647 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.905469 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.908492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-478qr\" (UniqueName: \"kubernetes.io/projected/639f5d85-78ac-409d-b2ed-b809cb59bfc5-kube-api-access-478qr\") pod \"barbican-operator-controller-manager-59bc569d95-2tslm\" (UID: \"639f5d85-78ac-409d-b2ed-b809cb59bfc5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.909856 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-p8lbx" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.926508 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.932815 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.933697 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.938754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xfxtt" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.955033 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.955917 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.958652 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-r87dz" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.958784 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.967125 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.973472 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4"] Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.974216 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.982870 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-74bq7" Mar 20 15:08:19 crc kubenswrapper[4764]: I0320 15:08:19.990844 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.009461 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-478qr\" (UniqueName: \"kubernetes.io/projected/639f5d85-78ac-409d-b2ed-b809cb59bfc5-kube-api-access-478qr\") pod \"barbican-operator-controller-manager-59bc569d95-2tslm\" (UID: \"639f5d85-78ac-409d-b2ed-b809cb59bfc5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:08:20 crc 
kubenswrapper[4764]: I0320 15:08:20.009506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkmc\" (UniqueName: \"kubernetes.io/projected/90b3a567-d460-4c6c-ba32-aaa43faf3add-kube-api-access-jjkmc\") pod \"designate-operator-controller-manager-588d4d986b-xrgp9\" (UID: \"90b3a567-d460-4c6c-ba32-aaa43faf3add\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.009528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zwp\" (UniqueName: \"kubernetes.io/projected/7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf-kube-api-access-x6zwp\") pod \"glance-operator-controller-manager-79df6bcc97-gh5rq\" (UID: \"7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.009558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxlj\" (UniqueName: \"kubernetes.io/projected/46418e4e-189e-4cc1-8df8-343f53697f68-kube-api-access-ppxlj\") pod \"cinder-operator-controller-manager-8d58dc466-q6wd6\" (UID: \"46418e4e-189e-4cc1-8df8-343f53697f68\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.009580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzlk\" (UniqueName: \"kubernetes.io/projected/3b81c755-123b-4caa-bc28-43ce8b672547-kube-api-access-6pzlk\") pod \"heat-operator-controller-manager-67dd5f86f5-5nkz6\" (UID: \"3b81c755-123b-4caa-bc28-43ce8b672547\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.014615 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.017878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.020735 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2qvwg" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.026813 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.027751 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.029935 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-fhjwn" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.044123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.052950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-478qr\" (UniqueName: \"kubernetes.io/projected/639f5d85-78ac-409d-b2ed-b809cb59bfc5-kube-api-access-478qr\") pod \"barbican-operator-controller-manager-59bc569d95-2tslm\" (UID: \"639f5d85-78ac-409d-b2ed-b809cb59bfc5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.073913 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2"] Mar 20 
15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.080756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.085162 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.085816 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.087656 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-psstb" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.090834 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.095089 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-24cf9"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.095682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.099482 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4vnwh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.105536 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.106372 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.108972 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rz9fl" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cv7g\" (UniqueName: \"kubernetes.io/projected/c8815a47-3a15-4fcb-a8eb-c72f767b30f0-kube-api-access-7cv7g\") pod \"keystone-operator-controller-manager-768b96df4c-dh2nv\" (UID: \"c8815a47-3a15-4fcb-a8eb-c72f767b30f0\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dwv\" (UniqueName: \"kubernetes.io/projected/8895943a-8d4d-471a-a0db-44eb2e832119-kube-api-access-g6dwv\") pod \"ironic-operator-controller-manager-6f787dddc9-8f4f4\" (UID: \"8895943a-8d4d-471a-a0db-44eb2e832119\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzzr\" (UniqueName: \"kubernetes.io/projected/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-kube-api-access-xwzzr\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111189 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkmc\" (UniqueName: 
\"kubernetes.io/projected/90b3a567-d460-4c6c-ba32-aaa43faf3add-kube-api-access-jjkmc\") pod \"designate-operator-controller-manager-588d4d986b-xrgp9\" (UID: \"90b3a567-d460-4c6c-ba32-aaa43faf3add\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zwp\" (UniqueName: \"kubernetes.io/projected/7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf-kube-api-access-x6zwp\") pod \"glance-operator-controller-manager-79df6bcc97-gh5rq\" (UID: \"7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxlj\" (UniqueName: \"kubernetes.io/projected/46418e4e-189e-4cc1-8df8-343f53697f68-kube-api-access-ppxlj\") pod \"cinder-operator-controller-manager-8d58dc466-q6wd6\" (UID: \"46418e4e-189e-4cc1-8df8-343f53697f68\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzlk\" (UniqueName: \"kubernetes.io/projected/3b81c755-123b-4caa-bc28-43ce8b672547-kube-api-access-6pzlk\") pod \"heat-operator-controller-manager-67dd5f86f5-5nkz6\" (UID: 
\"3b81c755-123b-4caa-bc28-43ce8b672547\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.111577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49q5f\" (UniqueName: \"kubernetes.io/projected/f38f9570-e767-488b-b3d1-97da8b4afa56-kube-api-access-49q5f\") pod \"horizon-operator-controller-manager-8464cc45fb-bh9n8\" (UID: \"f38f9570-e767-488b-b3d1-97da8b4afa56\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.112768 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.113619 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.116316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.116998 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4q8fr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.124067 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.127346 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-24cf9"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.144985 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzlk\" (UniqueName: 
\"kubernetes.io/projected/3b81c755-123b-4caa-bc28-43ce8b672547-kube-api-access-6pzlk\") pod \"heat-operator-controller-manager-67dd5f86f5-5nkz6\" (UID: \"3b81c755-123b-4caa-bc28-43ce8b672547\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.145310 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.152950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxlj\" (UniqueName: \"kubernetes.io/projected/46418e4e-189e-4cc1-8df8-343f53697f68-kube-api-access-ppxlj\") pod \"cinder-operator-controller-manager-8d58dc466-q6wd6\" (UID: \"46418e4e-189e-4cc1-8df8-343f53697f68\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.157281 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkmc\" (UniqueName: \"kubernetes.io/projected/90b3a567-d460-4c6c-ba32-aaa43faf3add-kube-api-access-jjkmc\") pod \"designate-operator-controller-manager-588d4d986b-xrgp9\" (UID: \"90b3a567-d460-4c6c-ba32-aaa43faf3add\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.157558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zwp\" (UniqueName: \"kubernetes.io/projected/7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf-kube-api-access-x6zwp\") pod \"glance-operator-controller-manager-79df6bcc97-gh5rq\" (UID: \"7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.170773 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.173462 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.185818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.186126 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.190205 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.193705 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wcvzt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.194804 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.212872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.212936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49q5f\" (UniqueName: \"kubernetes.io/projected/f38f9570-e767-488b-b3d1-97da8b4afa56-kube-api-access-49q5f\") pod \"horizon-operator-controller-manager-8464cc45fb-bh9n8\" (UID: \"f38f9570-e767-488b-b3d1-97da8b4afa56\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.212971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxmj\" (UniqueName: \"kubernetes.io/projected/0410685f-0ab6-43db-9831-f6cd0b0e7f6f-kube-api-access-6cxmj\") pod \"neutron-operator-controller-manager-767865f676-24cf9\" (UID: \"0410685f-0ab6-43db-9831-f6cd0b0e7f6f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.212996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlx48\" (UniqueName: \"kubernetes.io/projected/8a040614-ef32-4b0d-a5ae-a3336d26bc71-kube-api-access-jlx48\") pod \"mariadb-operator-controller-manager-67ccfc9778-sdqrr\" (UID: \"8a040614-ef32-4b0d-a5ae-a3336d26bc71\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 
15:08:20.213013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvsz8\" (UniqueName: \"kubernetes.io/projected/238654a5-1849-4f9f-9496-a8e796655b37-kube-api-access-pvsz8\") pod \"manila-operator-controller-manager-55f864c847-hkvw2\" (UID: \"238654a5-1849-4f9f-9496-a8e796655b37\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.213035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cv7g\" (UniqueName: \"kubernetes.io/projected/c8815a47-3a15-4fcb-a8eb-c72f767b30f0-kube-api-access-7cv7g\") pod \"keystone-operator-controller-manager-768b96df4c-dh2nv\" (UID: \"c8815a47-3a15-4fcb-a8eb-c72f767b30f0\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.213060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4v2\" (UniqueName: \"kubernetes.io/projected/f111a23d-c6f2-4ca4-9434-aa20eafdf979-kube-api-access-9w4v2\") pod \"nova-operator-controller-manager-5d488d59fb-7lxwm\" (UID: \"f111a23d-c6f2-4ca4-9434-aa20eafdf979\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.213088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh4wm\" (UniqueName: \"kubernetes.io/projected/5f0943ff-7d84-467b-9d22-53fdacb1b054-kube-api-access-dh4wm\") pod \"octavia-operator-controller-manager-5b9f45d989-pwjdt\" (UID: \"5f0943ff-7d84-467b-9d22-53fdacb1b054\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.213106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g6dwv\" (UniqueName: \"kubernetes.io/projected/8895943a-8d4d-471a-a0db-44eb2e832119-kube-api-access-g6dwv\") pod \"ironic-operator-controller-manager-6f787dddc9-8f4f4\" (UID: \"8895943a-8d4d-471a-a0db-44eb2e832119\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.213123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzzr\" (UniqueName: \"kubernetes.io/projected/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-kube-api-access-xwzzr\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.213576 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.213638 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:20.713620064 +0000 UTC m=+1022.329809193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.221220 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.236828 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.238222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dwv\" (UniqueName: \"kubernetes.io/projected/8895943a-8d4d-471a-a0db-44eb2e832119-kube-api-access-g6dwv\") pod \"ironic-operator-controller-manager-6f787dddc9-8f4f4\" (UID: \"8895943a-8d4d-471a-a0db-44eb2e832119\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.240778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzzr\" (UniqueName: \"kubernetes.io/projected/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-kube-api-access-xwzzr\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.246214 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cv7g\" (UniqueName: \"kubernetes.io/projected/c8815a47-3a15-4fcb-a8eb-c72f767b30f0-kube-api-access-7cv7g\") pod \"keystone-operator-controller-manager-768b96df4c-dh2nv\" (UID: \"c8815a47-3a15-4fcb-a8eb-c72f767b30f0\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.252145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49q5f\" (UniqueName: \"kubernetes.io/projected/f38f9570-e767-488b-b3d1-97da8b4afa56-kube-api-access-49q5f\") pod \"horizon-operator-controller-manager-8464cc45fb-bh9n8\" (UID: 
\"f38f9570-e767-488b-b3d1-97da8b4afa56\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.260334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.261242 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.262826 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.263366 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-mczqj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.287503 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.288973 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.292351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-cwdcw" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.296659 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.313288 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.314219 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.318969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlx48\" (UniqueName: \"kubernetes.io/projected/8a040614-ef32-4b0d-a5ae-a3336d26bc71-kube-api-access-jlx48\") pod \"mariadb-operator-controller-manager-67ccfc9778-sdqrr\" (UID: \"8a040614-ef32-4b0d-a5ae-a3336d26bc71\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvsz8\" (UniqueName: \"kubernetes.io/projected/238654a5-1849-4f9f-9496-a8e796655b37-kube-api-access-pvsz8\") pod \"manila-operator-controller-manager-55f864c847-hkvw2\" (UID: \"238654a5-1849-4f9f-9496-a8e796655b37\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319051 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9w4v2\" (UniqueName: \"kubernetes.io/projected/f111a23d-c6f2-4ca4-9434-aa20eafdf979-kube-api-access-9w4v2\") pod \"nova-operator-controller-manager-5d488d59fb-7lxwm\" (UID: \"f111a23d-c6f2-4ca4-9434-aa20eafdf979\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxf4z\" (UniqueName: \"kubernetes.io/projected/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-kube-api-access-cxf4z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh4wm\" (UniqueName: \"kubernetes.io/projected/5f0943ff-7d84-467b-9d22-53fdacb1b054-kube-api-access-dh4wm\") pod \"octavia-operator-controller-manager-5b9f45d989-pwjdt\" (UID: \"5f0943ff-7d84-467b-9d22-53fdacb1b054\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xt8jh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxmj\" (UniqueName: \"kubernetes.io/projected/0410685f-0ab6-43db-9831-f6cd0b0e7f6f-kube-api-access-6cxmj\") pod \"neutron-operator-controller-manager-767865f676-24cf9\" (UID: \"0410685f-0ab6-43db-9831-f6cd0b0e7f6f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 
15:08:20.319268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.319410 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.342917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.362286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlx48\" (UniqueName: \"kubernetes.io/projected/8a040614-ef32-4b0d-a5ae-a3336d26bc71-kube-api-access-jlx48\") pod \"mariadb-operator-controller-manager-67ccfc9778-sdqrr\" (UID: \"8a040614-ef32-4b0d-a5ae-a3336d26bc71\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.364808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxmj\" (UniqueName: \"kubernetes.io/projected/0410685f-0ab6-43db-9831-f6cd0b0e7f6f-kube-api-access-6cxmj\") pod \"neutron-operator-controller-manager-767865f676-24cf9\" (UID: \"0410685f-0ab6-43db-9831-f6cd0b0e7f6f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.368406 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh4wm\" (UniqueName: 
\"kubernetes.io/projected/5f0943ff-7d84-467b-9d22-53fdacb1b054-kube-api-access-dh4wm\") pod \"octavia-operator-controller-manager-5b9f45d989-pwjdt\" (UID: \"5f0943ff-7d84-467b-9d22-53fdacb1b054\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.370703 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4v2\" (UniqueName: \"kubernetes.io/projected/f111a23d-c6f2-4ca4-9434-aa20eafdf979-kube-api-access-9w4v2\") pod \"nova-operator-controller-manager-5d488d59fb-7lxwm\" (UID: \"f111a23d-c6f2-4ca4-9434-aa20eafdf979\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.377016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvsz8\" (UniqueName: \"kubernetes.io/projected/238654a5-1849-4f9f-9496-a8e796655b37-kube-api-access-pvsz8\") pod \"manila-operator-controller-manager-55f864c847-hkvw2\" (UID: \"238654a5-1849-4f9f-9496-a8e796655b37\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.377439 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.380851 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.386255 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.399089 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.399347 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.400496 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.405579 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-w7sz8" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.420940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.420989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl68\" (UniqueName: \"kubernetes.io/projected/5783d021-3258-4703-b4d8-0989af31ce65-kube-api-access-kkl68\") pod \"placement-operator-controller-manager-5784578c99-d2jjj\" (UID: \"5783d021-3258-4703-b4d8-0989af31ce65\") " 
pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.421020 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2q4w\" (UniqueName: \"kubernetes.io/projected/0254dff8-49b6-49a6-a79b-c366bd0f247e-kube-api-access-j2q4w\") pod \"swift-operator-controller-manager-c674c5965-mqh5f\" (UID: \"0254dff8-49b6-49a6-a79b-c366bd0f247e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.421047 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj4fd\" (UniqueName: \"kubernetes.io/projected/73b3ae86-f483-4083-8eb1-9925cac6b796-kube-api-access-tj4fd\") pod \"ovn-operator-controller-manager-555bbdc4dc-c94kc\" (UID: \"73b3ae86-f483-4083-8eb1-9925cac6b796\") " pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.421071 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxf4z\" (UniqueName: \"kubernetes.io/projected/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-kube-api-access-cxf4z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.421367 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.421550 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" 
failed. No retries permitted until 2026-03-20 15:08:20.921524742 +0000 UTC m=+1022.537713871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.421662 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.442733 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.468296 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.473637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxf4z\" (UniqueName: \"kubernetes.io/projected/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-kube-api-access-cxf4z\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.522295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm5lw\" (UniqueName: \"kubernetes.io/projected/d5b94bce-37f0-4816-bfa7-6947c258f201-kube-api-access-mm5lw\") pod \"telemetry-operator-controller-manager-d6b694c5-wpnxh\" (UID: \"d5b94bce-37f0-4816-bfa7-6947c258f201\") " 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.522349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl68\" (UniqueName: \"kubernetes.io/projected/5783d021-3258-4703-b4d8-0989af31ce65-kube-api-access-kkl68\") pod \"placement-operator-controller-manager-5784578c99-d2jjj\" (UID: \"5783d021-3258-4703-b4d8-0989af31ce65\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.522390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2q4w\" (UniqueName: \"kubernetes.io/projected/0254dff8-49b6-49a6-a79b-c366bd0f247e-kube-api-access-j2q4w\") pod \"swift-operator-controller-manager-c674c5965-mqh5f\" (UID: \"0254dff8-49b6-49a6-a79b-c366bd0f247e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.522420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj4fd\" (UniqueName: \"kubernetes.io/projected/73b3ae86-f483-4083-8eb1-9925cac6b796-kube-api-access-tj4fd\") pod \"ovn-operator-controller-manager-555bbdc4dc-c94kc\" (UID: \"73b3ae86-f483-4083-8eb1-9925cac6b796\") " pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.540450 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.546003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2q4w\" (UniqueName: \"kubernetes.io/projected/0254dff8-49b6-49a6-a79b-c366bd0f247e-kube-api-access-j2q4w\") pod \"swift-operator-controller-manager-c674c5965-mqh5f\" (UID: \"0254dff8-49b6-49a6-a79b-c366bd0f247e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.567855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl68\" (UniqueName: \"kubernetes.io/projected/5783d021-3258-4703-b4d8-0989af31ce65-kube-api-access-kkl68\") pod \"placement-operator-controller-manager-5784578c99-d2jjj\" (UID: \"5783d021-3258-4703-b4d8-0989af31ce65\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.567897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj4fd\" (UniqueName: \"kubernetes.io/projected/73b3ae86-f483-4083-8eb1-9925cac6b796-kube-api-access-tj4fd\") pod \"ovn-operator-controller-manager-555bbdc4dc-c94kc\" (UID: \"73b3ae86-f483-4083-8eb1-9925cac6b796\") " pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.573190 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.579600 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.579661 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.586714 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vl8hp" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.591574 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.624846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm5lw\" (UniqueName: \"kubernetes.io/projected/d5b94bce-37f0-4816-bfa7-6947c258f201-kube-api-access-mm5lw\") pod \"telemetry-operator-controller-manager-d6b694c5-wpnxh\" (UID: \"d5b94bce-37f0-4816-bfa7-6947c258f201\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.625305 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.638126 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.640846 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.644650 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-jdgrs" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.658368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm5lw\" (UniqueName: \"kubernetes.io/projected/d5b94bce-37f0-4816-bfa7-6947c258f201-kube-api-access-mm5lw\") pod \"telemetry-operator-controller-manager-d6b694c5-wpnxh\" (UID: \"d5b94bce-37f0-4816-bfa7-6947c258f201\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.660820 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.678108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.693572 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.696089 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.698794 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.699426 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-b4xk7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.699981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.726036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.726078 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv2nh\" (UniqueName: \"kubernetes.io/projected/c9009e06-5771-43ec-a800-c79012d6c18e-kube-api-access-cv2nh\") pod \"test-operator-controller-manager-5c5cb9c4d7-skvs9\" (UID: \"c9009e06-5771-43ec-a800-c79012d6c18e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.726202 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.726242 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert 
podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:21.72622848 +0000 UTC m=+1023.342417609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.734601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.750770 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.785823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.787959 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.792729 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-kz97z" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.795567 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.828524 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pv7l\" (UniqueName: \"kubernetes.io/projected/7232f0f7-0987-43fc-ab04-eaf226617757-kube-api-access-4pv7l\") pod \"watcher-operator-controller-manager-6c4d75f7f9-62lwt\" (UID: \"7232f0f7-0987-43fc-ab04-eaf226617757\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.828611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.828670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv2nh\" (UniqueName: \"kubernetes.io/projected/c9009e06-5771-43ec-a800-c79012d6c18e-kube-api-access-cv2nh\") pod \"test-operator-controller-manager-5c5cb9c4d7-skvs9\" (UID: \"c9009e06-5771-43ec-a800-c79012d6c18e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.828703 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.828741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5dm\" (UniqueName: \"kubernetes.io/projected/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-kube-api-access-df5dm\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.864793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv2nh\" (UniqueName: \"kubernetes.io/projected/c9009e06-5771-43ec-a800-c79012d6c18e-kube-api-access-cv2nh\") pod \"test-operator-controller-manager-5c5cb9c4d7-skvs9\" (UID: \"c9009e06-5771-43ec-a800-c79012d6c18e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.876449 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.899827 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.910747 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj7zp\" (UniqueName: \"kubernetes.io/projected/605fb18f-f047-4e83-bdf7-24556aec2ed8-kube-api-access-fj7zp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jjrw6\" (UID: \"605fb18f-f047-4e83-bdf7-24556aec2ed8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pv7l\" (UniqueName: \"kubernetes.io/projected/7232f0f7-0987-43fc-ab04-eaf226617757-kube-api-access-4pv7l\") pod \"watcher-operator-controller-manager-6c4d75f7f9-62lwt\" (UID: \"7232f0f7-0987-43fc-ab04-eaf226617757\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.930343 4764 
secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.930420 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:21.430401914 +0000 UTC m=+1023.046591033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.930445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5dm\" (UniqueName: \"kubernetes.io/projected/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-kube-api-access-df5dm\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.930504 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 
15:08:20.930557 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" failed. No retries permitted until 2026-03-20 15:08:21.930539129 +0000 UTC m=+1023.546728338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.930665 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: E0320 15:08:20.930749 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:21.430738245 +0000 UTC m=+1023.046927374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.946276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pv7l\" (UniqueName: \"kubernetes.io/projected/7232f0f7-0987-43fc-ab04-eaf226617757-kube-api-access-4pv7l\") pod \"watcher-operator-controller-manager-6c4d75f7f9-62lwt\" (UID: \"7232f0f7-0987-43fc-ab04-eaf226617757\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.967397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5dm\" (UniqueName: \"kubernetes.io/projected/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-kube-api-access-df5dm\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.977823 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9"] Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.978188 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:08:20 crc kubenswrapper[4764]: I0320 15:08:20.994229 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.031493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj7zp\" (UniqueName: \"kubernetes.io/projected/605fb18f-f047-4e83-bdf7-24556aec2ed8-kube-api-access-fj7zp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jjrw6\" (UID: \"605fb18f-f047-4e83-bdf7-24556aec2ed8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.031924 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b3a567_d460_4c6c_ba32_aaa43faf3add.slice/crio-a46e14d03f6c8706eed25e832f451c8b7089844eeb0dec971fda8db37f851bce WatchSource:0}: Error finding container a46e14d03f6c8706eed25e832f451c8b7089844eeb0dec971fda8db37f851bce: Status 404 returned error can't find the container with id a46e14d03f6c8706eed25e832f451c8b7089844eeb0dec971fda8db37f851bce Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.066847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj7zp\" (UniqueName: \"kubernetes.io/projected/605fb18f-f047-4e83-bdf7-24556aec2ed8-kube-api-access-fj7zp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jjrw6\" (UID: \"605fb18f-f047-4e83-bdf7-24556aec2ed8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.166622 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.174078 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.179360 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.227232 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8895943a_8d4d_471a_a0db_44eb2e832119.slice/crio-2d69e22b14afaabd54e23154158624e5154c725f4bacf99180c10be120a4b867 WatchSource:0}: Error finding container 2d69e22b14afaabd54e23154158624e5154c725f4bacf99180c10be120a4b867: Status 404 returned error can't find the container with id 2d69e22b14afaabd54e23154158624e5154c725f4bacf99180c10be120a4b867 Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.437579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.437670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.437815 4764 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.437863 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:22.437847944 +0000 UTC m=+1024.054037063 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.438167 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.438196 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:22.438188824 +0000 UTC m=+1024.054377953 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.470719 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.479960 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38f9570_e767_488b_b3d1_97da8b4afa56.slice/crio-89da30cc50b70d3e263a2df59bc0055cbc9453d37086c1aa4e34fe6943408725 WatchSource:0}: Error finding container 89da30cc50b70d3e263a2df59bc0055cbc9453d37086c1aa4e34fe6943408725: Status 404 returned error can't find the container with id 89da30cc50b70d3e263a2df59bc0055cbc9453d37086c1aa4e34fe6943408725 Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.507095 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-24cf9"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.511628 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.527975 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.539213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.541751 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf111a23d_c6f2_4ca4_9434_aa20eafdf979.slice/crio-430698312dd8e48f1a85b070480b0fd9f0c53ffd8caaa78528270b52dd9a8331 WatchSource:0}: Error finding container 430698312dd8e48f1a85b070480b0fd9f0c53ffd8caaa78528270b52dd9a8331: Status 404 returned error can't find the container with id 430698312dd8e48f1a85b070480b0fd9f0c53ffd8caaa78528270b52dd9a8331 Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.593192 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.599873 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.606612 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5b94bce_37f0_4816_bfa7_6947c258f201.slice/crio-ebcb1b10cd9ec7f275666ced1c1f4e2b82c3f0413bf79deb3d50da7e29c6b542 WatchSource:0}: Error finding container ebcb1b10cd9ec7f275666ced1c1f4e2b82c3f0413bf79deb3d50da7e29c6b542: Status 404 returned error can't find the container with id ebcb1b10cd9ec7f275666ced1c1f4e2b82c3f0413bf79deb3d50da7e29c6b542 Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.614497 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8815a47_3a15_4fcb_a8eb_c72f767b30f0.slice/crio-7f639a36e878aa1a553de3bfe7b141543e384a6617999cac9668fb3fdfd5a9cd WatchSource:0}: Error finding container 7f639a36e878aa1a553de3bfe7b141543e384a6617999cac9668fb3fdfd5a9cd: Status 404 returned error can't find the container with id 7f639a36e878aa1a553de3bfe7b141543e384a6617999cac9668fb3fdfd5a9cd Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.694971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" event={"ID":"46418e4e-189e-4cc1-8df8-343f53697f68","Type":"ContainerStarted","Data":"5e956337af14ed4544caf805bd5abad5f9ed8fb534bcd61bd35cf86d1b18d286"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.696357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" event={"ID":"f38f9570-e767-488b-b3d1-97da8b4afa56","Type":"ContainerStarted","Data":"89da30cc50b70d3e263a2df59bc0055cbc9453d37086c1aa4e34fe6943408725"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.701750 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.702997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" event={"ID":"90b3a567-d460-4c6c-ba32-aaa43faf3add","Type":"ContainerStarted","Data":"a46e14d03f6c8706eed25e832f451c8b7089844eeb0dec971fda8db37f851bce"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.704151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" event={"ID":"0410685f-0ab6-43db-9831-f6cd0b0e7f6f","Type":"ContainerStarted","Data":"3936307f9595fca1e41ba8c4d9168e6bc89b4cde81d2d75f9133140b69a697ff"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.708023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" event={"ID":"c8815a47-3a15-4fcb-a8eb-c72f767b30f0","Type":"ContainerStarted","Data":"7f639a36e878aa1a553de3bfe7b141543e384a6617999cac9668fb3fdfd5a9cd"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.709242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" event={"ID":"3b81c755-123b-4caa-bc28-43ce8b672547","Type":"ContainerStarted","Data":"dce3145a9ff17e6c1e050c7e0cc292dc0ac586d9bcee59332b0de9144b9cfd03"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.710222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" event={"ID":"8895943a-8d4d-471a-a0db-44eb2e832119","Type":"ContainerStarted","Data":"2d69e22b14afaabd54e23154158624e5154c725f4bacf99180c10be120a4b867"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.711337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" event={"ID":"f111a23d-c6f2-4ca4-9434-aa20eafdf979","Type":"ContainerStarted","Data":"430698312dd8e48f1a85b070480b0fd9f0c53ffd8caaa78528270b52dd9a8331"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.716936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" event={"ID":"7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf","Type":"ContainerStarted","Data":"cc428115f7fd5e14786faa49431ce633fcf5c3f9d49009d9bb3c5718203ba13c"} Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.717951 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pv7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-62lwt_openstack-operators(7232f0f7-0987-43fc-ab04-eaf226617757): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.717959 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.65:5001/openstack-k8s-operators/ovn-operator:bbf8293a07c1a79cb0d7dad30c44a6e85edba63b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tj4fd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-555bbdc4dc-c94kc_openstack-operators(73b3ae86-f483-4083-8eb1-9925cac6b796): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.718876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" event={"ID":"8a040614-ef32-4b0d-a5ae-a3336d26bc71","Type":"ContainerStarted","Data":"9b1fee00d02463f92173653fdab546f3205cf7278404e3e089da08a93f53f1f1"} Mar 20 15:08:21 
crc kubenswrapper[4764]: E0320 15:08:21.719010 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" podUID="73b3ae86-f483-4083-8eb1-9925cac6b796" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.719039 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" podUID="7232f0f7-0987-43fc-ab04-eaf226617757" Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.719828 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0254dff8_49b6_49a6_a79b_c366bd0f247e.slice/crio-b84a0306570d7070b940cfaba1913c4da2d9e7a8a047bcb39d0977be4a6fb9b4 WatchSource:0}: Error finding container b84a0306570d7070b940cfaba1913c4da2d9e7a8a047bcb39d0977be4a6fb9b4: Status 404 returned error can't find the container with id b84a0306570d7070b940cfaba1913c4da2d9e7a8a047bcb39d0977be4a6fb9b4 Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.720532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" event={"ID":"639f5d85-78ac-409d-b2ed-b809cb59bfc5","Type":"ContainerStarted","Data":"cdff815ebc5c88f608be391ef61f00159d5047d027e585d68e30bc5876f07c67"} Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.721272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" event={"ID":"238654a5-1849-4f9f-9496-a8e796655b37","Type":"ContainerStarted","Data":"d726b1d6bf95a6f82166c46be44831d78c54217572d9055c834acecd0e2d08c4"} Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.722239 4764 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j2q4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-mqh5f_openstack-operators(0254dff8-49b6-49a6-a79b-c366bd0f247e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.722754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" event={"ID":"d5b94bce-37f0-4816-bfa7-6947c258f201","Type":"ContainerStarted","Data":"ebcb1b10cd9ec7f275666ced1c1f4e2b82c3f0413bf79deb3d50da7e29c6b542"} Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.723394 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" podUID="0254dff8-49b6-49a6-a79b-c366bd0f247e" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.732007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.735677 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f0943ff_7d84_467b_9d22_53fdacb1b054.slice/crio-f773c5f3e69ce5e71d940cf1099e0d65c4c71d1b0e5f9f4483aa2173a78d0404 WatchSource:0}: Error finding container f773c5f3e69ce5e71d940cf1099e0d65c4c71d1b0e5f9f4483aa2173a78d0404: Status 404 returned error can't find the container with id f773c5f3e69ce5e71d940cf1099e0d65c4c71d1b0e5f9f4483aa2173a78d0404 Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.738833 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dh4wm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-pwjdt_openstack-operators(5f0943ff-7d84-467b-9d22-53fdacb1b054): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.740169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" podUID="5f0943ff-7d84-467b-9d22-53fdacb1b054" Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.740888 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.741035 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 
15:08:21.741070 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:23.741059307 +0000 UTC m=+1025.357248436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.746646 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.759970 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.764159 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f"] Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.864487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9"] Mar 20 15:08:21 crc kubenswrapper[4764]: W0320 15:08:21.873089 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9009e06_5771_43ec_a800_c79012d6c18e.slice/crio-b1f88407d20451ea0f393e5d0be9e96d78605d777e9ededdb1ace3a70d7b5261 WatchSource:0}: Error finding container b1f88407d20451ea0f393e5d0be9e96d78605d777e9ededdb1ace3a70d7b5261: Status 404 returned error can't find the container with id b1f88407d20451ea0f393e5d0be9e96d78605d777e9ededdb1ace3a70d7b5261 Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.945069 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.945275 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.945330 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" failed. No retries permitted until 2026-03-20 15:08:23.945310623 +0000 UTC m=+1025.561499752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:21 crc kubenswrapper[4764]: I0320 15:08:21.955889 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6"] Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.962484 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fj7zp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-jjrw6_openstack-operators(605fb18f-f047-4e83-bdf7-24556aec2ed8): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 15:08:21 crc kubenswrapper[4764]: E0320 15:08:21.963554 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" podUID="605fb18f-f047-4e83-bdf7-24556aec2ed8" Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.451549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.451653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.451755 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.451838 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:24.451818214 +0000 UTC m=+1026.068007363 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.451941 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.452048 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:24.45201915 +0000 UTC m=+1026.068208319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.763549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" event={"ID":"5f0943ff-7d84-467b-9d22-53fdacb1b054","Type":"ContainerStarted","Data":"f773c5f3e69ce5e71d940cf1099e0d65c4c71d1b0e5f9f4483aa2173a78d0404"} Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.765466 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" podUID="5f0943ff-7d84-467b-9d22-53fdacb1b054" Mar 20 15:08:22 crc 
kubenswrapper[4764]: I0320 15:08:22.785605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" event={"ID":"73b3ae86-f483-4083-8eb1-9925cac6b796","Type":"ContainerStarted","Data":"b3b9045089428ededc68fd86e55c0699acc37024e8fde9abfb556ade4080b63c"} Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.789161 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" event={"ID":"7232f0f7-0987-43fc-ab04-eaf226617757","Type":"ContainerStarted","Data":"9546269e2a7ac0f4c74e8cca2372251670704c3e52169be679a99c4d9e60859c"} Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.791679 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" podUID="7232f0f7-0987-43fc-ab04-eaf226617757" Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.796187 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.65:5001/openstack-k8s-operators/ovn-operator:bbf8293a07c1a79cb0d7dad30c44a6e85edba63b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" podUID="73b3ae86-f483-4083-8eb1-9925cac6b796" Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.800131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" event={"ID":"0254dff8-49b6-49a6-a79b-c366bd0f247e","Type":"ContainerStarted","Data":"b84a0306570d7070b940cfaba1913c4da2d9e7a8a047bcb39d0977be4a6fb9b4"} Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 
15:08:22.811418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" event={"ID":"5783d021-3258-4703-b4d8-0989af31ce65","Type":"ContainerStarted","Data":"1dd9df3c2a3bd3cef6a6a80f6b84b865c73b9526c2209c266c0677b5e64b2cd7"} Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.815988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" event={"ID":"605fb18f-f047-4e83-bdf7-24556aec2ed8","Type":"ContainerStarted","Data":"91d63a5668d9e1ca0ebd13a829842f597f77249b878bf6423aebb4850ac97e73"} Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.816936 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" podUID="0254dff8-49b6-49a6-a79b-c366bd0f247e" Mar 20 15:08:22 crc kubenswrapper[4764]: E0320 15:08:22.817509 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" podUID="605fb18f-f047-4e83-bdf7-24556aec2ed8" Mar 20 15:08:22 crc kubenswrapper[4764]: I0320 15:08:22.828224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" event={"ID":"c9009e06-5771-43ec-a800-c79012d6c18e","Type":"ContainerStarted","Data":"b1f88407d20451ea0f393e5d0be9e96d78605d777e9ededdb1ace3a70d7b5261"} Mar 20 15:08:23 crc kubenswrapper[4764]: I0320 
15:08:23.771958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.772145 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.772216 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:27.772198374 +0000 UTC m=+1029.388387503 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.847568 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" podUID="0254dff8-49b6-49a6-a79b-c366bd0f247e" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.847655 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.65:5001/openstack-k8s-operators/ovn-operator:bbf8293a07c1a79cb0d7dad30c44a6e85edba63b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" podUID="73b3ae86-f483-4083-8eb1-9925cac6b796" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.847779 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" podUID="5f0943ff-7d84-467b-9d22-53fdacb1b054" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.847829 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" podUID="7232f0f7-0987-43fc-ab04-eaf226617757" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.847884 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" podUID="605fb18f-f047-4e83-bdf7-24556aec2ed8" Mar 20 15:08:23 crc kubenswrapper[4764]: I0320 15:08:23.975538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: 
\"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.975676 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:23 crc kubenswrapper[4764]: E0320 15:08:23.975721 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" failed. No retries permitted until 2026-03-20 15:08:27.975708097 +0000 UTC m=+1029.591897226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:24 crc kubenswrapper[4764]: I0320 15:08:24.484541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:24 crc kubenswrapper[4764]: I0320 15:08:24.484856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:24 crc kubenswrapper[4764]: E0320 
15:08:24.484694 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:24 crc kubenswrapper[4764]: E0320 15:08:24.484931 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:28.484912461 +0000 UTC m=+1030.101101580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:24 crc kubenswrapper[4764]: E0320 15:08:24.484974 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:24 crc kubenswrapper[4764]: E0320 15:08:24.485006 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:28.484998213 +0000 UTC m=+1030.101187332 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:27 crc kubenswrapper[4764]: I0320 15:08:27.835166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:27 crc kubenswrapper[4764]: E0320 15:08:27.835353 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:27 crc kubenswrapper[4764]: E0320 15:08:27.835653 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:35.835634434 +0000 UTC m=+1037.451823563 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: I0320 15:08:28.038891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.039083 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.039162 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" failed. No retries permitted until 2026-03-20 15:08:36.039142197 +0000 UTC m=+1037.655331326 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: I0320 15:08:28.547141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:28 crc kubenswrapper[4764]: I0320 15:08:28.547245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.547400 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.547443 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:36.547430272 +0000 UTC m=+1038.163619401 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.547733 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:28 crc kubenswrapper[4764]: E0320 15:08:28.547763 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:36.547755022 +0000 UTC m=+1038.163944151 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:35 crc kubenswrapper[4764]: I0320 15:08:35.627567 4764 scope.go:117] "RemoveContainer" containerID="3a9022cc51e1ed5e9a1edc09e9e71e0223714085cfdf0ffea69a9bc48e0dfb0d" Mar 20 15:08:35 crc kubenswrapper[4764]: I0320 15:08:35.857709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:35 crc kubenswrapper[4764]: E0320 15:08:35.857826 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:35 crc 
kubenswrapper[4764]: E0320 15:08:35.857912 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert podName:0d25f0a7-3740-4f16-96f3-63b0f587f0a0 nodeName:}" failed. No retries permitted until 2026-03-20 15:08:51.857893494 +0000 UTC m=+1053.474082633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert") pod "infra-operator-controller-manager-7b9c774f96-kj5l7" (UID: "0d25f0a7-3740-4f16-96f3-63b0f587f0a0") : secret "infra-operator-webhook-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: I0320 15:08:36.061480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.061655 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.061715 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert podName:6efa9b2a-452d-42b2-bb4c-fe8d41747d3f nodeName:}" failed. No retries permitted until 2026-03-20 15:08:52.061699766 +0000 UTC m=+1053.677888905 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" (UID: "6efa9b2a-452d-42b2-bb4c-fe8d41747d3f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: I0320 15:08:36.568705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:36 crc kubenswrapper[4764]: I0320 15:08:36.568862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.568993 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.569032 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.569125 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:52.569082104 +0000 UTC m=+1054.185271263 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "webhook-server-cert" not found Mar 20 15:08:36 crc kubenswrapper[4764]: E0320 15:08:36.569152 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs podName:bce3e053-25d9-4eb2-933f-3e40a6ae89ab nodeName:}" failed. No retries permitted until 2026-03-20 15:08:52.569140815 +0000 UTC m=+1054.185329974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs") pod "openstack-operator-controller-manager-6f477c5b6b-tqfm2" (UID: "bce3e053-25d9-4eb2-933f-3e40a6ae89ab") : secret "metrics-server-cert" not found Mar 20 15:08:38 crc kubenswrapper[4764]: E0320 15:08:38.671429 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 15:08:38 crc kubenswrapper[4764]: E0320 15:08:38.671805 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49q5f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-bh9n8_openstack-operators(f38f9570-e767-488b-b3d1-97da8b4afa56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:08:38 crc kubenswrapper[4764]: E0320 15:08:38.673001 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" podUID="f38f9570-e767-488b-b3d1-97da8b4afa56" Mar 20 15:08:38 crc kubenswrapper[4764]: E0320 15:08:38.960544 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" podUID="f38f9570-e767-488b-b3d1-97da8b4afa56" Mar 20 15:08:48 crc kubenswrapper[4764]: E0320 15:08:48.166900 4764 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444" Mar 20 15:08:48 crc kubenswrapper[4764]: E0320 15:08:48.167810 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mm5lw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-wpnxh_openstack-operators(d5b94bce-37f0-4816-bfa7-6947c258f201): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:08:48 crc kubenswrapper[4764]: E0320 15:08:48.169095 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" podUID="d5b94bce-37f0-4816-bfa7-6947c258f201" Mar 20 15:08:49 crc kubenswrapper[4764]: E0320 15:08:49.041802 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" podUID="d5b94bce-37f0-4816-bfa7-6947c258f201" Mar 20 15:08:49 crc kubenswrapper[4764]: E0320 15:08:49.928103 4764 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 15:08:49 crc kubenswrapper[4764]: E0320 15:08:49.928682 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kkl68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-d2jjj_openstack-operators(5783d021-3258-4703-b4d8-0989af31ce65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:08:49 crc kubenswrapper[4764]: E0320 15:08:49.929803 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" podUID="5783d021-3258-4703-b4d8-0989af31ce65" Mar 20 15:08:50 crc kubenswrapper[4764]: E0320 15:08:50.173842 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" podUID="5783d021-3258-4703-b4d8-0989af31ce65" Mar 20 15:08:51 crc kubenswrapper[4764]: I0320 15:08:51.902737 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:51 crc kubenswrapper[4764]: I0320 15:08:51.911997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d25f0a7-3740-4f16-96f3-63b0f587f0a0-cert\") pod \"infra-operator-controller-manager-7b9c774f96-kj5l7\" (UID: \"0d25f0a7-3740-4f16-96f3-63b0f587f0a0\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:51 crc kubenswrapper[4764]: E0320 15:08:51.925590 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 15:08:51 crc kubenswrapper[4764]: E0320 15:08:51.925777 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9w4v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-7lxwm_openstack-operators(f111a23d-c6f2-4ca4-9434-aa20eafdf979): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:08:51 crc kubenswrapper[4764]: E0320 15:08:51.927898 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" podUID="f111a23d-c6f2-4ca4-9434-aa20eafdf979" Mar 20 15:08:52 crc kubenswrapper[4764]: E0320 15:08:52.054304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" podUID="f111a23d-c6f2-4ca4-9434-aa20eafdf979" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.102634 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.105433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.118224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6efa9b2a-452d-42b2-bb4c-fe8d41747d3f-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-z2q6p\" (UID: \"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.357662 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.614552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.614687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.620707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-webhook-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.627795 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bce3e053-25d9-4eb2-933f-3e40a6ae89ab-metrics-certs\") pod \"openstack-operator-controller-manager-6f477c5b6b-tqfm2\" (UID: \"bce3e053-25d9-4eb2-933f-3e40a6ae89ab\") " pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:52 crc kubenswrapper[4764]: E0320 15:08:52.716423 4764 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 15:08:52 crc kubenswrapper[4764]: E0320 15:08:52.716572 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cv7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-dh2nv_openstack-operators(c8815a47-3a15-4fcb-a8eb-c72f767b30f0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:08:52 crc kubenswrapper[4764]: E0320 15:08:52.717767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" podUID="c8815a47-3a15-4fcb-a8eb-c72f767b30f0" Mar 20 15:08:52 crc kubenswrapper[4764]: I0320 15:08:52.830208 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:08:53 crc kubenswrapper[4764]: E0320 15:08:53.062437 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" podUID="c8815a47-3a15-4fcb-a8eb-c72f767b30f0" Mar 20 15:09:00 crc kubenswrapper[4764]: I0320 15:09:00.297611 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7"] Mar 20 15:09:00 crc kubenswrapper[4764]: I0320 15:09:00.341724 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p"] Mar 20 15:09:00 crc kubenswrapper[4764]: W0320 15:09:00.370595 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6efa9b2a_452d_42b2_bb4c_fe8d41747d3f.slice/crio-97aece7e9948c2a917fc78e00c4f6e64cc778a22d126fc38c3d035fd39167e37 WatchSource:0}: Error finding container 97aece7e9948c2a917fc78e00c4f6e64cc778a22d126fc38c3d035fd39167e37: Status 404 returned error can't find the container with id 97aece7e9948c2a917fc78e00c4f6e64cc778a22d126fc38c3d035fd39167e37 Mar 20 15:09:00 crc kubenswrapper[4764]: I0320 15:09:00.448774 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2"] Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.162021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" 
event={"ID":"0410685f-0ab6-43db-9831-f6cd0b0e7f6f","Type":"ContainerStarted","Data":"765c97b7b7130a79e0314bfa9e43053c200249a18ff62e16cee8e437fe155f0a"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.162422 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.171280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" event={"ID":"7232f0f7-0987-43fc-ab04-eaf226617757","Type":"ContainerStarted","Data":"941b3b9e27f74d7488da90047b44eb114b7687227de029c3816b12092c0be935"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.171528 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.173277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" event={"ID":"0254dff8-49b6-49a6-a79b-c366bd0f247e","Type":"ContainerStarted","Data":"2c3a0ff7d1d4cc24e630ade9e6e5a2b6ba1420e1dbec24a043df8caf7d500c8a"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.173423 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.179752 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" podStartSLOduration=9.45961144 podStartE2EDuration="41.179736207s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.507101108 +0000 UTC m=+1023.123290227" lastFinishedPulling="2026-03-20 15:08:53.227225835 +0000 UTC m=+1054.843414994" 
observedRunningTime="2026-03-20 15:09:01.178210131 +0000 UTC m=+1062.794399280" watchObservedRunningTime="2026-03-20 15:09:01.179736207 +0000 UTC m=+1062.795925336" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.185128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" event={"ID":"bce3e053-25d9-4eb2-933f-3e40a6ae89ab","Type":"ContainerStarted","Data":"1788b760b48a1349df9987e689a10912d0f915db0dcb23a5dfc2cfc91ee65c87"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.185165 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" event={"ID":"bce3e053-25d9-4eb2-933f-3e40a6ae89ab","Type":"ContainerStarted","Data":"ff21a4876cdb05480f8b9649543dcdee91539b40fc1b674f913b30d0fe5a0545"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.185298 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.204123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" event={"ID":"238654a5-1849-4f9f-9496-a8e796655b37","Type":"ContainerStarted","Data":"19b451795d4758c7182d4ef1d925fde0adc7cbf6208e0e5131cef50ce6503f60"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.204254 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.212648 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" podStartSLOduration=3.564800284 podStartE2EDuration="41.212603309s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" 
firstStartedPulling="2026-03-20 15:08:21.71780208 +0000 UTC m=+1023.333991209" lastFinishedPulling="2026-03-20 15:08:59.365605095 +0000 UTC m=+1060.981794234" observedRunningTime="2026-03-20 15:09:01.209540227 +0000 UTC m=+1062.825729356" watchObservedRunningTime="2026-03-20 15:09:01.212603309 +0000 UTC m=+1062.828792438" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.216691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" event={"ID":"0d25f0a7-3740-4f16-96f3-63b0f587f0a0","Type":"ContainerStarted","Data":"a744f5d5d67ade7686487560b38bf8a1b5b2079c132c4c3b73cdc1524e803aae"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.238208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" event={"ID":"c9009e06-5771-43ec-a800-c79012d6c18e","Type":"ContainerStarted","Data":"fbd1270578330fd47196e264669df5ce6b284633524619be3a615bed132d66bb"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.238304 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.241710 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" event={"ID":"7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf","Type":"ContainerStarted","Data":"ec281ff4f9567f6227805d614f788eff9641383f45ff195d47a707c1a635c900"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.241870 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.251684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" 
event={"ID":"5f0943ff-7d84-467b-9d22-53fdacb1b054","Type":"ContainerStarted","Data":"40122606706e64a761f618735fbac4af9ac934b647e57c4a976a2775be9525b3"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.251958 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.260626 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" event={"ID":"8895943a-8d4d-471a-a0db-44eb2e832119","Type":"ContainerStarted","Data":"b1114b0f5f071ce719ffcebc8940fff5b65f7597d8db8ee468ba1c5a922921d9"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.260808 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.268702 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" event={"ID":"605fb18f-f047-4e83-bdf7-24556aec2ed8","Type":"ContainerStarted","Data":"ce78562d53c552867c0cdd99405c8df70854545adb1a05af5ed4f45a58d61d29"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.273573 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" event={"ID":"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f","Type":"ContainerStarted","Data":"97aece7e9948c2a917fc78e00c4f6e64cc778a22d126fc38c3d035fd39167e37"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.276318 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" podStartSLOduration=4.09340207 podStartE2EDuration="41.276302863s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 
15:08:21.722152822 +0000 UTC m=+1023.338341951" lastFinishedPulling="2026-03-20 15:08:58.905053615 +0000 UTC m=+1060.521242744" observedRunningTime="2026-03-20 15:09:01.276092317 +0000 UTC m=+1062.892281446" watchObservedRunningTime="2026-03-20 15:09:01.276302863 +0000 UTC m=+1062.892491992" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.291225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" event={"ID":"90b3a567-d460-4c6c-ba32-aaa43faf3add","Type":"ContainerStarted","Data":"65c31e20d172877363cdcb2bfacc876168270844f2f8fa5b9feb99da77cc9003"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.291468 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.303893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" event={"ID":"8a040614-ef32-4b0d-a5ae-a3336d26bc71","Type":"ContainerStarted","Data":"3cd99ddcbc0192f880386b8c0db5a53650c64086252d8246579e6fe6b8e0e9fa"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.304474 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.312174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" event={"ID":"639f5d85-78ac-409d-b2ed-b809cb59bfc5","Type":"ContainerStarted","Data":"350c43321118d477733f4c985c762617c1ff30499532f6ef0ea5fe7bbd7df9f8"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.312820 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:09:01 crc 
kubenswrapper[4764]: I0320 15:09:01.323451 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" podStartSLOduration=4.449236583 podStartE2EDuration="41.323435241s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.738721955 +0000 UTC m=+1023.354911084" lastFinishedPulling="2026-03-20 15:08:58.612920573 +0000 UTC m=+1060.229109742" observedRunningTime="2026-03-20 15:09:01.31551825 +0000 UTC m=+1062.931707379" watchObservedRunningTime="2026-03-20 15:09:01.323435241 +0000 UTC m=+1062.939624370" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.330538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" event={"ID":"73b3ae86-f483-4083-8eb1-9925cac6b796","Type":"ContainerStarted","Data":"3b3de563cfa2d4e192523fae3511e41ba6968c7ee5a51c90118f2bcdcac8fc7a"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.331118 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.347076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" event={"ID":"3b81c755-123b-4caa-bc28-43ce8b672547","Type":"ContainerStarted","Data":"0c0dea50c5baad4d46fa9191306c7b659b3ea3551aaef46f31325fb5758bbf5c"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.347154 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.351173 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" podStartSLOduration=10.976937255 
podStartE2EDuration="42.351156257s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.071495142 +0000 UTC m=+1022.687684271" lastFinishedPulling="2026-03-20 15:08:52.445714094 +0000 UTC m=+1054.061903273" observedRunningTime="2026-03-20 15:09:01.346975649 +0000 UTC m=+1062.963164778" watchObservedRunningTime="2026-03-20 15:09:01.351156257 +0000 UTC m=+1062.967345386" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.359552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" event={"ID":"46418e4e-189e-4cc1-8df8-343f53697f68","Type":"ContainerStarted","Data":"7452b522ba4e623ba0282be79f0c1affde08fc2a2be1ea0e6c1d40fe089b5758"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.359841 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.379551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" event={"ID":"f38f9570-e767-488b-b3d1-97da8b4afa56","Type":"ContainerStarted","Data":"f6a428ff073f2869d3663847879c2711a226a0445e6f93cdefbd851ca74bdbc1"} Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.379950 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.443349 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" podStartSLOduration=10.873127227 podStartE2EDuration="41.443332909s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.875523243 +0000 UTC m=+1023.491712372" lastFinishedPulling="2026-03-20 15:08:52.445728895 
+0000 UTC m=+1054.061918054" observedRunningTime="2026-03-20 15:09:01.441472702 +0000 UTC m=+1063.057661831" watchObservedRunningTime="2026-03-20 15:09:01.443332909 +0000 UTC m=+1063.059522038" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.535284 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" podStartSLOduration=41.535263833 podStartE2EDuration="41.535263833s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:09:01.523594977 +0000 UTC m=+1063.139784106" watchObservedRunningTime="2026-03-20 15:09:01.535263833 +0000 UTC m=+1063.151452962" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.625081 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jjrw6" podStartSLOduration=3.5384736930000003 podStartE2EDuration="41.625055912s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.962365312 +0000 UTC m=+1023.578554441" lastFinishedPulling="2026-03-20 15:09:00.048947521 +0000 UTC m=+1061.665136660" observedRunningTime="2026-03-20 15:09:01.592157959 +0000 UTC m=+1063.208347088" watchObservedRunningTime="2026-03-20 15:09:01.625055912 +0000 UTC m=+1063.241245051" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.625730 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" podStartSLOduration=10.097788366 podStartE2EDuration="42.625725493s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.230225115 +0000 UTC m=+1022.846414244" lastFinishedPulling="2026-03-20 15:08:53.758162242 +0000 UTC m=+1055.374351371" observedRunningTime="2026-03-20 
15:09:01.617387159 +0000 UTC m=+1063.233576288" watchObservedRunningTime="2026-03-20 15:09:01.625725493 +0000 UTC m=+1063.241914622" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.667096 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" podStartSLOduration=10.962343094 podStartE2EDuration="42.667081204s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.521104313 +0000 UTC m=+1023.137293442" lastFinishedPulling="2026-03-20 15:08:53.225842383 +0000 UTC m=+1054.842031552" observedRunningTime="2026-03-20 15:09:01.664189646 +0000 UTC m=+1063.280378775" watchObservedRunningTime="2026-03-20 15:09:01.667081204 +0000 UTC m=+1063.283270333" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.709193 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" podStartSLOduration=11.488844112 podStartE2EDuration="42.709174178s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.225730699 +0000 UTC m=+1022.841919828" lastFinishedPulling="2026-03-20 15:08:52.446060735 +0000 UTC m=+1054.062249894" observedRunningTime="2026-03-20 15:09:01.703411642 +0000 UTC m=+1063.319600771" watchObservedRunningTime="2026-03-20 15:09:01.709174178 +0000 UTC m=+1063.325363307" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.735445 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" podStartSLOduration=11.182974972 podStartE2EDuration="42.73542976s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:20.889841342 +0000 UTC m=+1022.506030461" lastFinishedPulling="2026-03-20 15:08:52.44229608 +0000 UTC m=+1054.058485249" observedRunningTime="2026-03-20 15:09:01.735162701 
+0000 UTC m=+1063.351351850" watchObservedRunningTime="2026-03-20 15:09:01.73542976 +0000 UTC m=+1063.351618889" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.789772 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" podStartSLOduration=11.378465607 podStartE2EDuration="42.789751066s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.03456933 +0000 UTC m=+1022.650758459" lastFinishedPulling="2026-03-20 15:08:52.445854749 +0000 UTC m=+1054.062043918" observedRunningTime="2026-03-20 15:09:01.778199424 +0000 UTC m=+1063.394388553" watchObservedRunningTime="2026-03-20 15:09:01.789751066 +0000 UTC m=+1063.405940195" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.814652 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" podStartSLOduration=10.116105804 podStartE2EDuration="41.814635025s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.528472438 +0000 UTC m=+1023.144661567" lastFinishedPulling="2026-03-20 15:08:53.227001659 +0000 UTC m=+1054.843190788" observedRunningTime="2026-03-20 15:09:01.811315645 +0000 UTC m=+1063.427504774" watchObservedRunningTime="2026-03-20 15:09:01.814635025 +0000 UTC m=+1063.430824154" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.874536 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" podStartSLOduration=4.292394973 podStartE2EDuration="42.874522023s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.489003538 +0000 UTC m=+1023.105192657" lastFinishedPulling="2026-03-20 15:09:00.071130568 +0000 UTC m=+1061.687319707" observedRunningTime="2026-03-20 15:09:01.871701177 +0000 UTC 
m=+1063.487890306" watchObservedRunningTime="2026-03-20 15:09:01.874522023 +0000 UTC m=+1063.490711152" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.938325 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" podStartSLOduration=3.3576069139999998 podStartE2EDuration="41.938307308s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.717850822 +0000 UTC m=+1023.334039951" lastFinishedPulling="2026-03-20 15:09:00.298551226 +0000 UTC m=+1061.914740345" observedRunningTime="2026-03-20 15:09:01.931218022 +0000 UTC m=+1063.547407151" watchObservedRunningTime="2026-03-20 15:09:01.938307308 +0000 UTC m=+1063.554496437" Mar 20 15:09:01 crc kubenswrapper[4764]: I0320 15:09:01.968011 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" podStartSLOduration=10.521487436 podStartE2EDuration="42.967994074s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:20.779315804 +0000 UTC m=+1022.395504933" lastFinishedPulling="2026-03-20 15:08:53.225822442 +0000 UTC m=+1054.842011571" observedRunningTime="2026-03-20 15:09:01.964201868 +0000 UTC m=+1063.580390997" watchObservedRunningTime="2026-03-20 15:09:01.967994074 +0000 UTC m=+1063.584183203" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.422283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" event={"ID":"f111a23d-c6f2-4ca4-9434-aa20eafdf979","Type":"ContainerStarted","Data":"55e9cf38e4736271cc8f03a9d06b829e29b9ea21d9f85d4231953de3f7dc7c3b"} Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.423060 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 
15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.424097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" event={"ID":"5783d021-3258-4703-b4d8-0989af31ce65","Type":"ContainerStarted","Data":"e5befee192807a2eba21c250e604cd7b9c31cd0040972f91e3eed24d5b7094bf"} Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.424231 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.426049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" event={"ID":"0d25f0a7-3740-4f16-96f3-63b0f587f0a0","Type":"ContainerStarted","Data":"6a43ea6a0cc432314e7c540071b5ae7b5a3894bd9ff407750f5dd941af4c201f"} Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.426236 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.427524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" event={"ID":"d5b94bce-37f0-4816-bfa7-6947c258f201","Type":"ContainerStarted","Data":"2187801c8c424c875532723f7485f386b284d12aef927a55bcf180bd22603462"} Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.427705 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.430652 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" 
event={"ID":"6efa9b2a-452d-42b2-bb4c-fe8d41747d3f","Type":"ContainerStarted","Data":"1999640d500705639cf53d509d4b581ec9ae7aa07202e86d034cc56cce6c86c4"} Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.430803 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.456477 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" podStartSLOduration=2.791119244 podStartE2EDuration="46.45644916s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.544098522 +0000 UTC m=+1023.160287641" lastFinishedPulling="2026-03-20 15:09:05.209428428 +0000 UTC m=+1066.825617557" observedRunningTime="2026-03-20 15:09:06.441417052 +0000 UTC m=+1068.057606191" watchObservedRunningTime="2026-03-20 15:09:06.45644916 +0000 UTC m=+1068.072638329" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.466114 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" podStartSLOduration=2.971115255 podStartE2EDuration="46.466086655s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.716077428 +0000 UTC m=+1023.332266557" lastFinishedPulling="2026-03-20 15:09:05.211048808 +0000 UTC m=+1066.827237957" observedRunningTime="2026-03-20 15:09:06.455241504 +0000 UTC m=+1068.071430643" watchObservedRunningTime="2026-03-20 15:09:06.466086655 +0000 UTC m=+1068.082275824" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.475017 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" podStartSLOduration=2.873576222 podStartE2EDuration="46.474997047s" 
podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.60821293 +0000 UTC m=+1023.224402059" lastFinishedPulling="2026-03-20 15:09:05.209633755 +0000 UTC m=+1066.825822884" observedRunningTime="2026-03-20 15:09:06.473848071 +0000 UTC m=+1068.090037230" watchObservedRunningTime="2026-03-20 15:09:06.474997047 +0000 UTC m=+1068.091186186" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.510013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" podStartSLOduration=41.694247923 podStartE2EDuration="46.509971993s" podCreationTimestamp="2026-03-20 15:08:20 +0000 UTC" firstStartedPulling="2026-03-20 15:09:00.393370978 +0000 UTC m=+1062.009560107" lastFinishedPulling="2026-03-20 15:09:05.209095048 +0000 UTC m=+1066.825284177" observedRunningTime="2026-03-20 15:09:06.509391995 +0000 UTC m=+1068.125581134" watchObservedRunningTime="2026-03-20 15:09:06.509971993 +0000 UTC m=+1068.126161122" Mar 20 15:09:06 crc kubenswrapper[4764]: I0320 15:09:06.525161 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" podStartSLOduration=42.619481612 podStartE2EDuration="47.525146186s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:09:00.339153234 +0000 UTC m=+1061.955342363" lastFinishedPulling="2026-03-20 15:09:05.244817788 +0000 UTC m=+1066.861006937" observedRunningTime="2026-03-20 15:09:06.52297485 +0000 UTC m=+1068.139163979" watchObservedRunningTime="2026-03-20 15:09:06.525146186 +0000 UTC m=+1068.141335315" Mar 20 15:09:07 crc kubenswrapper[4764]: I0320 15:09:07.437641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" 
event={"ID":"c8815a47-3a15-4fcb-a8eb-c72f767b30f0","Type":"ContainerStarted","Data":"f6933eab18a7155d32ee59dfeb7d8f88806d20460549081d3276858599903611"} Mar 20 15:09:07 crc kubenswrapper[4764]: I0320 15:09:07.457148 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" podStartSLOduration=3.406869663 podStartE2EDuration="48.457130467s" podCreationTimestamp="2026-03-20 15:08:19 +0000 UTC" firstStartedPulling="2026-03-20 15:08:21.616396919 +0000 UTC m=+1023.232586038" lastFinishedPulling="2026-03-20 15:09:06.666657693 +0000 UTC m=+1068.282846842" observedRunningTime="2026-03-20 15:09:07.454749185 +0000 UTC m=+1069.070938324" watchObservedRunningTime="2026-03-20 15:09:07.457130467 +0000 UTC m=+1069.073319606" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.148784 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2tslm" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.176398 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q6wd6" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.191031 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xrgp9" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.201035 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gh5rq" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.224601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-5nkz6" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.278885 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-bh9n8" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.323772 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8f4f4" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.343851 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.385520 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-hkvw2" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.410624 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sdqrr" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.424958 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-24cf9" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.449735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-7lxwm" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.545500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-pwjdt" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.586089 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-mqh5f" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.632979 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-555bbdc4dc-c94kc" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.681269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d2jjj" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.755221 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-wpnxh" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.915445 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-skvs9" Mar 20 15:09:10 crc kubenswrapper[4764]: I0320 15:09:10.982741 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-62lwt" Mar 20 15:09:12 crc kubenswrapper[4764]: I0320 15:09:12.114269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-kj5l7" Mar 20 15:09:12 crc kubenswrapper[4764]: I0320 15:09:12.368214 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-z2q6p" Mar 20 15:09:12 crc kubenswrapper[4764]: I0320 15:09:12.840060 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6f477c5b6b-tqfm2" Mar 20 15:09:20 crc kubenswrapper[4764]: I0320 15:09:20.346150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dh2nv" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.119326 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:09:37 crc 
kubenswrapper[4764]: I0320 15:09:37.121199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.132586 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.133033 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.133263 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.134080 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-75cbc" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.140041 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.192269 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.193278 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.196815 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.202730 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.236706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.236820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr62c\" (UniqueName: \"kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.236881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7rpf\" (UniqueName: \"kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.236922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.236939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.339058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr62c\" (UniqueName: \"kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.339687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7rpf\" (UniqueName: \"kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.339786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.339835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 
15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.339931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.340793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.341868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.342146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.367603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7rpf\" (UniqueName: \"kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf\") pod \"dnsmasq-dns-78dd6ddcc-kdmwd\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.378425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cr62c\" (UniqueName: \"kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c\") pod \"dnsmasq-dns-675f4bcbfc-t8725\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.440726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.508373 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:09:37 crc kubenswrapper[4764]: I0320 15:09:37.960094 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:09:38 crc kubenswrapper[4764]: I0320 15:09:38.029098 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:09:38 crc kubenswrapper[4764]: W0320 15:09:38.029253 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b34fb62_e4d1_4fc1_8995_4f69d4709797.slice/crio-162fad7ea87f1d1a243245b23383c09296ef35ccc3355c05edad4a7f2b873d79 WatchSource:0}: Error finding container 162fad7ea87f1d1a243245b23383c09296ef35ccc3355c05edad4a7f2b873d79: Status 404 returned error can't find the container with id 162fad7ea87f1d1a243245b23383c09296ef35ccc3355c05edad4a7f2b873d79 Mar 20 15:09:38 crc kubenswrapper[4764]: I0320 15:09:38.724011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" event={"ID":"4b34fb62-e4d1-4fc1-8995-4f69d4709797","Type":"ContainerStarted","Data":"162fad7ea87f1d1a243245b23383c09296ef35ccc3355c05edad4a7f2b873d79"} Mar 20 15:09:38 crc kubenswrapper[4764]: I0320 15:09:38.725369 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" 
event={"ID":"24cf8196-4f32-493e-bbfd-bb674cee89d1","Type":"ContainerStarted","Data":"d895ec4388b155a932cf3adc8d68db828f0ca1dae1964c3a70b2a8918ef91819"} Mar 20 15:09:39 crc kubenswrapper[4764]: I0320 15:09:39.925927 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:09:39 crc kubenswrapper[4764]: I0320 15:09:39.958265 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:09:39 crc kubenswrapper[4764]: I0320 15:09:39.959413 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:39 crc kubenswrapper[4764]: I0320 15:09:39.966082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.086856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wvvj\" (UniqueName: \"kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.086923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.086963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.187861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.187953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wvvj\" (UniqueName: \"kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.188006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.234456 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.243641 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.246371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" 
(UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.248738 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.249932 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.251742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wvvj\" (UniqueName: \"kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj\") pod \"dnsmasq-dns-5ccc8479f9-nvp2n\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.252450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.278679 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.288899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqbzh\" (UniqueName: \"kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.288941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.288996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.390116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.390222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqbzh\" (UniqueName: \"kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: 
\"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.390250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.391288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.391446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.416136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqbzh\" (UniqueName: \"kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh\") pod \"dnsmasq-dns-57d769cc4f-jrnjm\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.602832 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.777007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:09:40 crc kubenswrapper[4764]: W0320 15:09:40.793104 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce83c633_aed5_474f_abe9_3e17b6856b5e.slice/crio-15d2f7c0e9cea29b79b59a452b8d3a5fa0a68961c8cbaa963abd1c4f7006eb39 WatchSource:0}: Error finding container 15d2f7c0e9cea29b79b59a452b8d3a5fa0a68961c8cbaa963abd1c4f7006eb39: Status 404 returned error can't find the container with id 15d2f7c0e9cea29b79b59a452b8d3a5fa0a68961c8cbaa963abd1c4f7006eb39 Mar 20 15:09:40 crc kubenswrapper[4764]: I0320 15:09:40.837236 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:09:40 crc kubenswrapper[4764]: W0320 15:09:40.845848 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d51d09_6b83_45fd_87b9_9a0302c765e9.slice/crio-b0bfd10640cf913c2fd3921a73589dfb1fdb34108ff44605f0d19b4bedffb659 WatchSource:0}: Error finding container b0bfd10640cf913c2fd3921a73589dfb1fdb34108ff44605f0d19b4bedffb659: Status 404 returned error can't find the container with id b0bfd10640cf913c2fd3921a73589dfb1fdb34108ff44605f0d19b4bedffb659 Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.142337 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.143715 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.143764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.145635 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.145676 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.145814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.145920 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jb55x" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.147205 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.147319 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.147446 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.205745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.205800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.205943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206005 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206065 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206191 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sd4dk\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.206492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4dk\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307666 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.307971 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308356 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.308509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.326338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.327356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.327425 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.336272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.336469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.336646 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.337741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4dk\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.343969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.375834 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.377308 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.381807 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382027 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382169 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382339 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382540 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382640 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.382736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l5h5r" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.406161 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2dxbk\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.409850 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.463645 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.475735 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dxbk\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511456 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511490 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0" 
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511525 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.511549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.512227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.513638 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.513713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.514682 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.516897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.520778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.531745 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.532559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.532896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.540297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dxbk\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.540849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.549626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.711439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.761173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" event={"ID":"ce83c633-aed5-474f-abe9-3e17b6856b5e","Type":"ContainerStarted","Data":"15d2f7c0e9cea29b79b59a452b8d3a5fa0a68961c8cbaa963abd1c4f7006eb39"}
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.767751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" event={"ID":"c1d51d09-6b83-45fd-87b9-9a0302c765e9","Type":"ContainerStarted","Data":"b0bfd10640cf913c2fd3921a73589dfb1fdb34108ff44605f0d19b4bedffb659"}
Mar 20 15:09:41 crc kubenswrapper[4764]: I0320 15:09:41.999492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 15:09:42 crc kubenswrapper[4764]: W0320 15:09:42.005897 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb497b447_0f6a_47e6_b106_16ca68b88d44.slice/crio-4beb67358a9ecc69ecae0219e0286bb747f91402a76ccfe400d1b9e615da848b WatchSource:0}: Error finding container 4beb67358a9ecc69ecae0219e0286bb747f91402a76ccfe400d1b9e615da848b: Status 404 returned error can't find the container with id 4beb67358a9ecc69ecae0219e0286bb747f91402a76ccfe400d1b9e615da848b
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.250520 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 15:09:42 crc kubenswrapper[4764]: W0320 15:09:42.259580 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1499d4_3bae_40c1_882d_ad9778b9eb80.slice/crio-9be30d51b62a9de92cdefbadf886cbd1c223e81fc4a632398508ad1b8784d72b WatchSource:0}: Error finding container 9be30d51b62a9de92cdefbadf886cbd1c223e81fc4a632398508ad1b8784d72b: Status 404 returned error can't find the container with id 9be30d51b62a9de92cdefbadf886cbd1c223e81fc4a632398508ad1b8784d72b
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.594551 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.596861 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.599672 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.600026 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.600227 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tltch"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.600655 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.606461 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.611351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.736768 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.736812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-default\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65l7t\" (UniqueName: \"kubernetes.io/projected/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kube-api-access-65l7t\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kolla-config\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.737907 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.790101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerStarted","Data":"4beb67358a9ecc69ecae0219e0286bb747f91402a76ccfe400d1b9e615da848b"}
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.796830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerStarted","Data":"9be30d51b62a9de92cdefbadf886cbd1c223e81fc4a632398508ad1b8784d72b"}
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65l7t\" (UniqueName: \"kubernetes.io/projected/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kube-api-access-65l7t\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kolla-config\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839921 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.839986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.840009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-default\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.840194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.841024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.841284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kolla-config\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.842564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.842595 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.842662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-config-data-default\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.846830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.846884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.860445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65l7t\" (UniqueName: \"kubernetes.io/projected/02550cd6-b0c3-4f74-a6d2-c9348fc00cc5-kube-api-access-65l7t\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.860778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5\") " pod="openstack/openstack-galera-0"
Mar 20 15:09:42 crc kubenswrapper[4764]: I0320 15:09:42.921490 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.383719 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.786085 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.787989 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.790699 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.790732 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.790751 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-tsvm2"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.791144 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.807502 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.965605 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.965658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.965928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.966067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.966136 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xhc8\" (UniqueName: \"kubernetes.io/projected/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kube-api-access-9xhc8\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.966194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.966255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:43 crc kubenswrapper[4764]: I0320 15:09:43.966282 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xhc8\" (UniqueName: \"kubernetes.io/projected/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kube-api-access-9xhc8\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067432 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.067535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.068571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.068704 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.070356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.070771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.070982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3534c6-4831-4cf1-9c4a-99bf3e934022-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.073286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.091089 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3534c6-4831-4cf1-9c4a-99bf3e934022-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.100252 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.101148 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xhc8\" (UniqueName: \"kubernetes.io/projected/ce3534c6-4831-4cf1-9c4a-99bf3e934022-kube-api-access-9xhc8\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.101352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ce3534c6-4831-4cf1-9c4a-99bf3e934022\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.102891 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.104823 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.105031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7q5kk"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.105292 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.116838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.123884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.269639 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-kolla-config\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.269685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4jgs\" (UniqueName: \"kubernetes.io/projected/161fc524-f747-4350-ac9c-0670b2a338bb-kube-api-access-m4jgs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.269769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.269812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-config-data\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.269872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.371729 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.371798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-config-data\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.371843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.371883 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-kolla-config\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.371899 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4jgs\" (UniqueName: \"kubernetes.io/projected/161fc524-f747-4350-ac9c-0670b2a338bb-kube-api-access-m4jgs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.373676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-config-data\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.373920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/161fc524-f747-4350-ac9c-0670b2a338bb-kolla-config\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.380103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0"
Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.380276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/161fc524-f747-4350-ac9c-0670b2a338bb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " 
pod="openstack/memcached-0" Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.392753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4jgs\" (UniqueName: \"kubernetes.io/projected/161fc524-f747-4350-ac9c-0670b2a338bb-kube-api-access-m4jgs\") pod \"memcached-0\" (UID: \"161fc524-f747-4350-ac9c-0670b2a338bb\") " pod="openstack/memcached-0" Mar 20 15:09:44 crc kubenswrapper[4764]: I0320 15:09:44.481655 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.346264 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.347121 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.357519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-fn5z8" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.370764 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.505192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7z4\" (UniqueName: \"kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4\") pod \"kube-state-metrics-0\" (UID: \"a9084836-71c4-46c4-9cec-f2f2a5489914\") " pod="openstack/kube-state-metrics-0" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.606520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7z4\" (UniqueName: \"kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4\") pod \"kube-state-metrics-0\" (UID: \"a9084836-71c4-46c4-9cec-f2f2a5489914\") " 
pod="openstack/kube-state-metrics-0" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.626932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7z4\" (UniqueName: \"kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4\") pod \"kube-state-metrics-0\" (UID: \"a9084836-71c4-46c4-9cec-f2f2a5489914\") " pod="openstack/kube-state-metrics-0" Mar 20 15:09:46 crc kubenswrapper[4764]: I0320 15:09:46.668609 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:09:47 crc kubenswrapper[4764]: I0320 15:09:47.848933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5","Type":"ContainerStarted","Data":"c4896774df07c13873dfb14cf5db629703ed831f3a19b18758fd730ca5e02eba"} Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.541788 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.543277 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.545538 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cv5k7" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.545792 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.545885 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.546677 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.552554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.572852 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.656819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.656880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.656936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.656970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.657160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.657340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.657371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw4rh\" (UniqueName: \"kubernetes.io/projected/4501c31c-d4db-4881-b791-c4a004cab3d2-kube-api-access-jw4rh\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.657436 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759391 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw4rh\" (UniqueName: \"kubernetes.io/projected/4501c31c-d4db-4881-b791-c4a004cab3d2-kube-api-access-jw4rh\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.759769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.760285 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.761155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.761629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.761711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4501c31c-d4db-4881-b791-c4a004cab3d2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.766841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.767287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.770620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4501c31c-d4db-4881-b791-c4a004cab3d2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.788780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 
15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.794451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw4rh\" (UniqueName: \"kubernetes.io/projected/4501c31c-d4db-4881-b791-c4a004cab3d2-kube-api-access-jw4rh\") pod \"ovsdbserver-nb-0\" (UID: \"4501c31c-d4db-4881-b791-c4a004cab3d2\") " pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.837660 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ncp4w"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.839152 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.844952 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.845206 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mt96c" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.844967 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.845455 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-kb2ph"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.847281 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.853708 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.861389 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kb2ph"] Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.875785 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-log-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-log\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-lib\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-scripts\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-combined-ca-bundle\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 
15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-ovn-controller-tls-certs\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-etc-ovs\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.962986 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tj7x\" (UniqueName: \"kubernetes.io/projected/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-kube-api-access-6tj7x\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.963006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-scripts\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.963027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: 
I0320 15:09:49.963046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.963135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9zg\" (UniqueName: \"kubernetes.io/projected/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-kube-api-access-sp9zg\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:49 crc kubenswrapper[4764]: I0320 15:09:49.963177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-run\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.063970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-etc-ovs\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tj7x\" (UniqueName: \"kubernetes.io/projected/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-kube-api-access-6tj7x\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064052 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-scripts\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064075 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064111 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9zg\" (UniqueName: \"kubernetes.io/projected/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-kube-api-access-sp9zg\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-run\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-log-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064185 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-log\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-lib\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-scripts\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-combined-ca-bundle\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-ovn-controller-tls-certs\") pod \"ovn-controller-ncp4w\" (UID: 
\"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-etc-ovs\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-run\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064787 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.064853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-run-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.065157 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-lib\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.065269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-var-log-ovn\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.065353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-var-log\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.067861 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-scripts\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.068755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-scripts\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.073303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-ovn-controller-tls-certs\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.078870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-combined-ca-bundle\") pod \"ovn-controller-ncp4w\" (UID: 
\"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.092933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tj7x\" (UniqueName: \"kubernetes.io/projected/cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d-kube-api-access-6tj7x\") pod \"ovn-controller-ncp4w\" (UID: \"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d\") " pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.093337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9zg\" (UniqueName: \"kubernetes.io/projected/bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a-kube-api-access-sp9zg\") pod \"ovn-controller-ovs-kb2ph\" (UID: \"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a\") " pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.176101 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w" Mar 20 15:09:50 crc kubenswrapper[4764]: I0320 15:09:50.192157 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.505666 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.508893 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.513866 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-87vq7" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.514007 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.513872 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.514640 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.523119 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.622713 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.622774 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.622838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.622883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.622939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.623070 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.623150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh88k\" (UniqueName: \"kubernetes.io/projected/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-kube-api-access-dh88k\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.623252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc 
kubenswrapper[4764]: I0320 15:09:53.724574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724649 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh88k\" (UniqueName: \"kubernetes.io/projected/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-kube-api-access-dh88k\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.724865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.725944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-config\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.726103 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.726591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc 
kubenswrapper[4764]: I0320 15:09:53.729204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.738509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.740580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.741164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.748174 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh88k\" (UniqueName: \"kubernetes.io/projected/70c53684-1a46-497b-8b8d-ab4e90fbe6c2-kube-api-access-dh88k\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.750629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"70c53684-1a46-497b-8b8d-ab4e90fbe6c2\") " pod="openstack/ovsdbserver-sb-0" Mar 20 15:09:53 crc kubenswrapper[4764]: I0320 15:09:53.879688 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.140211 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566990-w9x6v"] Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.142176 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.144860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.145262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.146255 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.151855 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566990-w9x6v"] Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.241360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7zzb\" (UniqueName: \"kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb\") pod \"auto-csr-approver-29566990-w9x6v\" (UID: \"a29f5ab4-c8db-4425-91e8-758e08b9caa8\") " pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.343058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z7zzb\" (UniqueName: \"kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb\") pod \"auto-csr-approver-29566990-w9x6v\" (UID: \"a29f5ab4-c8db-4425-91e8-758e08b9caa8\") " pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.363806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7zzb\" (UniqueName: \"kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb\") pod \"auto-csr-approver-29566990-w9x6v\" (UID: \"a29f5ab4-c8db-4425-91e8-758e08b9caa8\") " pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:00 crc kubenswrapper[4764]: I0320 15:10:00.533366 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:00 crc kubenswrapper[4764]: E0320 15:10:00.703755 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:10:00 crc kubenswrapper[4764]: E0320 15:10:00.703946 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqbzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-jrnjm_openstack(c1d51d09-6b83-45fd-87b9-9a0302c765e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:10:00 crc kubenswrapper[4764]: E0320 15:10:00.706436 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" podUID="c1d51d09-6b83-45fd-87b9-9a0302c765e9" Mar 20 15:10:00 crc kubenswrapper[4764]: E0320 15:10:00.961910 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" podUID="c1d51d09-6b83-45fd-87b9-9a0302c765e9" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.711268 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.711906 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65l7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(02550cd6-b0c3-4f74-a6d2-c9348fc00cc5): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.713409 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="02550cd6-b0c3-4f74-a6d2-c9348fc00cc5" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.854280 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.854459 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6wvvj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nvp2n_openstack(ce83c633-aed5-474f-abe9-3e17b6856b5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.855913 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" podUID="ce83c633-aed5-474f-abe9-3e17b6856b5e" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.890196 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.890925 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr62c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-t8725_openstack(24cf8196-4f32-493e-bbfd-bb674cee89d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.892304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" podUID="24cf8196-4f32-493e-bbfd-bb674cee89d1" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.894238 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.894440 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7rpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-kdmwd_openstack(4b34fb62-e4d1-4fc1-8995-4f69d4709797): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.895864 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" podUID="4b34fb62-e4d1-4fc1-8995-4f69d4709797" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.988637 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="02550cd6-b0c3-4f74-a6d2-c9348fc00cc5" Mar 20 15:10:02 crc kubenswrapper[4764]: E0320 15:10:02.988838 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" podUID="ce83c633-aed5-474f-abe9-3e17b6856b5e" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.526328 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.587963 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.652232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.688083 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.713510 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc\") pod \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config\") pod \"24cf8196-4f32-493e-bbfd-bb674cee89d1\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719347 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config\") pod \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719473 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7rpf\" (UniqueName: \"kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf\") pod \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\" (UID: \"4b34fb62-e4d1-4fc1-8995-4f69d4709797\") " Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719532 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr62c\" (UniqueName: \"kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c\") pod \"24cf8196-4f32-493e-bbfd-bb674cee89d1\" (UID: \"24cf8196-4f32-493e-bbfd-bb674cee89d1\") " Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.719635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config" (OuterVolumeSpecName: "config") pod "24cf8196-4f32-493e-bbfd-bb674cee89d1" (UID: "24cf8196-4f32-493e-bbfd-bb674cee89d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.720048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b34fb62-e4d1-4fc1-8995-4f69d4709797" (UID: "4b34fb62-e4d1-4fc1-8995-4f69d4709797"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.720194 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.720207 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24cf8196-4f32-493e-bbfd-bb674cee89d1-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.721007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config" (OuterVolumeSpecName: "config") pod "4b34fb62-e4d1-4fc1-8995-4f69d4709797" (UID: "4b34fb62-e4d1-4fc1-8995-4f69d4709797"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.735612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c" (OuterVolumeSpecName: "kube-api-access-cr62c") pod "24cf8196-4f32-493e-bbfd-bb674cee89d1" (UID: "24cf8196-4f32-493e-bbfd-bb674cee89d1"). InnerVolumeSpecName "kube-api-access-cr62c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.736605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf" (OuterVolumeSpecName: "kube-api-access-w7rpf") pod "4b34fb62-e4d1-4fc1-8995-4f69d4709797" (UID: "4b34fb62-e4d1-4fc1-8995-4f69d4709797"). InnerVolumeSpecName "kube-api-access-w7rpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.772241 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566990-w9x6v"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.777815 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.781222 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.821529 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr62c\" (UniqueName: \"kubernetes.io/projected/24cf8196-4f32-493e-bbfd-bb674cee89d1-kube-api-access-cr62c\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.821676 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b34fb62-e4d1-4fc1-8995-4f69d4709797-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.821769 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7rpf\" (UniqueName: \"kubernetes.io/projected/4b34fb62-e4d1-4fc1-8995-4f69d4709797-kube-api-access-w7rpf\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.858009 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-kb2ph"] Mar 20 15:10:03 crc kubenswrapper[4764]: W0320 15:10:03.884144 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf87a2bc_c9cc_4831_851a_ffcfca0e7d9a.slice/crio-18599a037698508be0022ecf9b71dd6217bac9ab8607efec1d77deb03e5f5312 WatchSource:0}: Error finding container 18599a037698508be0022ecf9b71dd6217bac9ab8607efec1d77deb03e5f5312: Status 404 returned error can't find the container with id 
18599a037698508be0022ecf9b71dd6217bac9ab8607efec1d77deb03e5f5312 Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.995116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce3534c6-4831-4cf1-9c4a-99bf3e934022","Type":"ContainerStarted","Data":"f55e110481bb79dd0b482f8f381e9e804d9adc7e7ba9ba9372aff2968c286084"} Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.997162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9084836-71c4-46c4-9cec-f2f2a5489914","Type":"ContainerStarted","Data":"cbad95d598646486a4852eb0ce8d4b3e6a5193c6bf5b2be7910a6bf426ce26cb"} Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.998243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"161fc524-f747-4350-ac9c-0670b2a338bb","Type":"ContainerStarted","Data":"642fcdbf68bd863835bf77c1ecd502d9036f5baca08858f792a98b55d75a09d3"} Mar 20 15:10:03 crc kubenswrapper[4764]: I0320 15:10:03.999100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" event={"ID":"a29f5ab4-c8db-4425-91e8-758e08b9caa8","Type":"ContainerStarted","Data":"d69295536063a9597ddf3ec79de8e8635d8c3f6bd088f9d205e1534a3d665f26"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.000307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kb2ph" event={"ID":"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a","Type":"ContainerStarted","Data":"18599a037698508be0022ecf9b71dd6217bac9ab8607efec1d77deb03e5f5312"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.001630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" event={"ID":"24cf8196-4f32-493e-bbfd-bb674cee89d1","Type":"ContainerDied","Data":"d895ec4388b155a932cf3adc8d68db828f0ca1dae1964c3a70b2a8918ef91819"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.001657 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8725" Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.002679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w" event={"ID":"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d","Type":"ContainerStarted","Data":"e7cfda8dca9cebd77cc43a6fce138b8e8f3d16bc1ad3977cb1ed9a49930f019a"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.003473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4501c31c-d4db-4881-b791-c4a004cab3d2","Type":"ContainerStarted","Data":"d818c748d3ee331c70b7ea26487ab5e16243eef5ea3ef28730b79bcbfab489d4"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.004220 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" event={"ID":"4b34fb62-e4d1-4fc1-8995-4f69d4709797","Type":"ContainerDied","Data":"162fad7ea87f1d1a243245b23383c09296ef35ccc3355c05edad4a7f2b873d79"} Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.004396 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-kdmwd" Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.090731 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.098731 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8725"] Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.140131 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.145603 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-kdmwd"] Mar 20 15:10:04 crc kubenswrapper[4764]: I0320 15:10:04.803967 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 15:10:04 crc kubenswrapper[4764]: W0320 15:10:04.815280 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c53684_1a46_497b_8b8d_ab4e90fbe6c2.slice/crio-102f8969c161cf42e7c291055789e0386d2034966234f4a12e86e9f5f2f8f128 WatchSource:0}: Error finding container 102f8969c161cf42e7c291055789e0386d2034966234f4a12e86e9f5f2f8f128: Status 404 returned error can't find the container with id 102f8969c161cf42e7c291055789e0386d2034966234f4a12e86e9f5f2f8f128 Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.012618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce3534c6-4831-4cf1-9c4a-99bf3e934022","Type":"ContainerStarted","Data":"df61f95d2826b35c74fe0d44c1260a97f339d1265cbde14506ce77abd91ecd74"} Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.015581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerStarted","Data":"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08"} Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.019047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"70c53684-1a46-497b-8b8d-ab4e90fbe6c2","Type":"ContainerStarted","Data":"102f8969c161cf42e7c291055789e0386d2034966234f4a12e86e9f5f2f8f128"} Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.022228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerStarted","Data":"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651"} Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.137629 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24cf8196-4f32-493e-bbfd-bb674cee89d1" path="/var/lib/kubelet/pods/24cf8196-4f32-493e-bbfd-bb674cee89d1/volumes" Mar 20 15:10:05 crc kubenswrapper[4764]: I0320 15:10:05.137992 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b34fb62-e4d1-4fc1-8995-4f69d4709797" path="/var/lib/kubelet/pods/4b34fb62-e4d1-4fc1-8995-4f69d4709797/volumes" Mar 20 15:10:08 crc kubenswrapper[4764]: I0320 15:10:08.446089 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:10:08 crc kubenswrapper[4764]: I0320 15:10:08.446586 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 15:10:09 crc kubenswrapper[4764]: I0320 15:10:09.062496 4764 generic.go:334] "Generic (PLEG): container finished" podID="ce3534c6-4831-4cf1-9c4a-99bf3e934022" containerID="df61f95d2826b35c74fe0d44c1260a97f339d1265cbde14506ce77abd91ecd74" exitCode=0 Mar 20 15:10:09 crc kubenswrapper[4764]: I0320 15:10:09.062554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ce3534c6-4831-4cf1-9c4a-99bf3e934022","Type":"ContainerDied","Data":"df61f95d2826b35c74fe0d44c1260a97f339d1265cbde14506ce77abd91ecd74"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.085150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9084836-71c4-46c4-9cec-f2f2a5489914","Type":"ContainerStarted","Data":"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.085899 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.088347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"70c53684-1a46-497b-8b8d-ab4e90fbe6c2","Type":"ContainerStarted","Data":"89e379037386144bda9e2cf0f39ef8d682051a866192628437e8ea87288fa481"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.090028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w" event={"ID":"cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d","Type":"ContainerStarted","Data":"7be77dbd5b713cdfee1a8b74c9c39494478e67738ba437770977c74b9468f252"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.090861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ncp4w" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.092638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"4501c31c-d4db-4881-b791-c4a004cab3d2","Type":"ContainerStarted","Data":"c3e61b5f0478a9a3c7de61fd72b78b76e7e0b6659a3a09b2185de339c891aa36"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.094755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"161fc524-f747-4350-ac9c-0670b2a338bb","Type":"ContainerStarted","Data":"2195c0bacd3e718b67c0133d22d98e38cdf28b45ad9fc92c105e9b8d4f4321ab"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.095202 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.096421 4764 generic.go:334] "Generic (PLEG): container finished" podID="a29f5ab4-c8db-4425-91e8-758e08b9caa8" containerID="e8310b6a04844394537da70e1c56f7dddc0b9212f6bf023da59cd0327362d61a" exitCode=0 Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.096514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" event={"ID":"a29f5ab4-c8db-4425-91e8-758e08b9caa8","Type":"ContainerDied","Data":"e8310b6a04844394537da70e1c56f7dddc0b9212f6bf023da59cd0327362d61a"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.098209 4764 generic.go:334] "Generic (PLEG): container finished" podID="bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a" containerID="1d0cd727d52826bf3a7d2b5b17bea10031d3c30bcd8425feb14a4319c044e6a8" exitCode=0 Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.098309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kb2ph" event={"ID":"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a","Type":"ContainerDied","Data":"1d0cd727d52826bf3a7d2b5b17bea10031d3c30bcd8425feb14a4319c044e6a8"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.101093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"ce3534c6-4831-4cf1-9c4a-99bf3e934022","Type":"ContainerStarted","Data":"0ae5e27ea8869e9afde8c972a0e2cf413a48fa893c0f776320fa52317ebd99a4"} Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.116779 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.699848708 podStartE2EDuration="25.116746773s" podCreationTimestamp="2026-03-20 15:09:46 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.689316307 +0000 UTC m=+1125.305505436" lastFinishedPulling="2026-03-20 15:10:10.106214362 +0000 UTC m=+1131.722403501" observedRunningTime="2026-03-20 15:10:11.110291722 +0000 UTC m=+1132.726480861" watchObservedRunningTime="2026-03-20 15:10:11.116746773 +0000 UTC m=+1132.732935942" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.143103 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ncp4w" podStartSLOduration=16.328704991 podStartE2EDuration="22.143084371s" podCreationTimestamp="2026-03-20 15:09:49 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.692451855 +0000 UTC m=+1125.308640994" lastFinishedPulling="2026-03-20 15:10:09.506831245 +0000 UTC m=+1131.123020374" observedRunningTime="2026-03-20 15:10:11.135619139 +0000 UTC m=+1132.751808268" watchObservedRunningTime="2026-03-20 15:10:11.143084371 +0000 UTC m=+1132.759273520" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.197859 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.589159049 podStartE2EDuration="27.197836244s" podCreationTimestamp="2026-03-20 15:09:44 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.687785309 +0000 UTC m=+1125.303974438" lastFinishedPulling="2026-03-20 15:10:09.296462464 +0000 UTC m=+1130.912651633" observedRunningTime="2026-03-20 15:10:11.161968619 +0000 UTC m=+1132.778157748" watchObservedRunningTime="2026-03-20 15:10:11.197836244 +0000 UTC 
m=+1132.814025373" Mar 20 15:10:11 crc kubenswrapper[4764]: I0320 15:10:11.204107 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.208887494 podStartE2EDuration="29.204092059s" podCreationTimestamp="2026-03-20 15:09:42 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.696713207 +0000 UTC m=+1125.312902356" lastFinishedPulling="2026-03-20 15:10:04.691917772 +0000 UTC m=+1126.308106921" observedRunningTime="2026-03-20 15:10:11.189802934 +0000 UTC m=+1132.805992113" watchObservedRunningTime="2026-03-20 15:10:11.204092059 +0000 UTC m=+1132.820281188" Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.114892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kb2ph" event={"ID":"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a","Type":"ContainerStarted","Data":"c113c95c7e858220aef50bd5d57f719b94968ae059bab43e913653fb49cbc980"} Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.115199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-kb2ph" event={"ID":"bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a","Type":"ContainerStarted","Data":"d1f7261de4b29916f5b51b810c4193ad9826b87fe03bbb587c38e229698825d8"} Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.115353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.138311 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-kb2ph" podStartSLOduration=17.599463285 podStartE2EDuration="23.138292316s" podCreationTimestamp="2026-03-20 15:09:49 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.886053585 +0000 UTC m=+1125.502242714" lastFinishedPulling="2026-03-20 15:10:09.424882606 +0000 UTC m=+1131.041071745" observedRunningTime="2026-03-20 15:10:12.137619465 +0000 UTC m=+1133.753808594" 
watchObservedRunningTime="2026-03-20 15:10:12.138292316 +0000 UTC m=+1133.754481445" Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.381256 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.476707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7zzb\" (UniqueName: \"kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb\") pod \"a29f5ab4-c8db-4425-91e8-758e08b9caa8\" (UID: \"a29f5ab4-c8db-4425-91e8-758e08b9caa8\") " Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.482273 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb" (OuterVolumeSpecName: "kube-api-access-z7zzb") pod "a29f5ab4-c8db-4425-91e8-758e08b9caa8" (UID: "a29f5ab4-c8db-4425-91e8-758e08b9caa8"). InnerVolumeSpecName "kube-api-access-z7zzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:12 crc kubenswrapper[4764]: I0320 15:10:12.578528 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7zzb\" (UniqueName: \"kubernetes.io/projected/a29f5ab4-c8db-4425-91e8-758e08b9caa8-kube-api-access-z7zzb\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.134725 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.141293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566990-w9x6v" event={"ID":"a29f5ab4-c8db-4425-91e8-758e08b9caa8","Type":"ContainerDied","Data":"d69295536063a9597ddf3ec79de8e8635d8c3f6bd088f9d205e1534a3d665f26"} Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.141337 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d69295536063a9597ddf3ec79de8e8635d8c3f6bd088f9d205e1534a3d665f26" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.141353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.289974 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xlzb7"] Mar 20 15:10:13 crc kubenswrapper[4764]: E0320 15:10:13.290282 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a29f5ab4-c8db-4425-91e8-758e08b9caa8" containerName="oc" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.290299 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29f5ab4-c8db-4425-91e8-758e08b9caa8" containerName="oc" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.290471 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a29f5ab4-c8db-4425-91e8-758e08b9caa8" containerName="oc" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.291037 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.293200 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.311113 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xlzb7"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389171 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnqnp\" (UniqueName: \"kubernetes.io/projected/8a5cf141-3541-4806-83e8-3338f7c2865c-kube-api-access-lnqnp\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389220 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5cf141-3541-4806-83e8-3338f7c2865c-config\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389247 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-combined-ca-bundle\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovs-rundir\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " 
pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovn-rundir\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.389372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.450641 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566984-rtmj8"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.458735 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566984-rtmj8"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.483550 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnqnp\" (UniqueName: 
\"kubernetes.io/projected/8a5cf141-3541-4806-83e8-3338f7c2865c-kube-api-access-lnqnp\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5cf141-3541-4806-83e8-3338f7c2865c-config\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-combined-ca-bundle\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490372 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovs-rundir\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovn-rundir\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.490760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovn-rundir\") pod 
\"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.491481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/8a5cf141-3541-4806-83e8-3338f7c2865c-ovs-rundir\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.491753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a5cf141-3541-4806-83e8-3338f7c2865c-config\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.496270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.498633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5cf141-3541-4806-83e8-3338f7c2865c-combined-ca-bundle\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.535934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnqnp\" (UniqueName: \"kubernetes.io/projected/8a5cf141-3541-4806-83e8-3338f7c2865c-kube-api-access-lnqnp\") pod \"ovn-controller-metrics-xlzb7\" (UID: \"8a5cf141-3541-4806-83e8-3338f7c2865c\") " 
pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.542980 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.544273 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.547413 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.550418 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.591552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.591802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggpf\" (UniqueName: \"kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.591833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 
15:10:13.591886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.616160 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xlzb7" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.681215 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.692864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.692937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.692976 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggpf\" (UniqueName: \"kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.693001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.693778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.694254 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.694737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.716719 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.718129 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.721616 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.744426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggpf\" (UniqueName: \"kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf\") pod \"dnsmasq-dns-7f896c8c65-ng5fq\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.768644 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.793440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.793491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nr9s\" (UniqueName: \"kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.793532 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.793581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.793606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.894596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.894644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.894692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" 
Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.894733 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nr9s\" (UniqueName: \"kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.894772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.895749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.895756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.895754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.895774 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.913947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nr9s\" (UniqueName: \"kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s\") pod \"dnsmasq-dns-86db49b7ff-77jwd\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:13 crc kubenswrapper[4764]: I0320 15:10:13.917153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.093282 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.117892 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.117949 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.679139 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808040 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce83c633-aed5-474f-abe9-3e17b6856b5e" (UID: "ce83c633-aed5-474f-abe9-3e17b6856b5e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808088 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc\") pod \"ce83c633-aed5-474f-abe9-3e17b6856b5e\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wvvj\" (UniqueName: \"kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj\") pod \"ce83c633-aed5-474f-abe9-3e17b6856b5e\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808280 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config\") pod \"ce83c633-aed5-474f-abe9-3e17b6856b5e\" (UID: \"ce83c633-aed5-474f-abe9-3e17b6856b5e\") " Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config" (OuterVolumeSpecName: "config") pod "ce83c633-aed5-474f-abe9-3e17b6856b5e" (UID: "ce83c633-aed5-474f-abe9-3e17b6856b5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.808995 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.809011 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce83c633-aed5-474f-abe9-3e17b6856b5e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.813498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj" (OuterVolumeSpecName: "kube-api-access-6wvvj") pod "ce83c633-aed5-474f-abe9-3e17b6856b5e" (UID: "ce83c633-aed5-474f-abe9-3e17b6856b5e"). InnerVolumeSpecName "kube-api-access-6wvvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:14 crc kubenswrapper[4764]: I0320 15:10:14.910244 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wvvj\" (UniqueName: \"kubernetes.io/projected/ce83c633-aed5-474f-abe9-3e17b6856b5e-kube-api-access-6wvvj\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.138172 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0526618-5c59-4e4b-a854-cd1d61c50c53" path="/var/lib/kubelet/pods/c0526618-5c59-4e4b-a854-cd1d61c50c53/volumes" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.149767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" event={"ID":"ce83c633-aed5-474f-abe9-3e17b6856b5e","Type":"ContainerDied","Data":"15d2f7c0e9cea29b79b59a452b8d3a5fa0a68961c8cbaa963abd1c4f7006eb39"} Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.149859 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nvp2n" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.203507 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.211098 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nvp2n"] Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.805428 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.829181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc\") pod \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.829623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config\") pod \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.829798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqbzh\" (UniqueName: \"kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh\") pod \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\" (UID: \"c1d51d09-6b83-45fd-87b9-9a0302c765e9\") " Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.829828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1d51d09-6b83-45fd-87b9-9a0302c765e9" (UID: "c1d51d09-6b83-45fd-87b9-9a0302c765e9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.830065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config" (OuterVolumeSpecName: "config") pod "c1d51d09-6b83-45fd-87b9-9a0302c765e9" (UID: "c1d51d09-6b83-45fd-87b9-9a0302c765e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.830497 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.830519 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d51d09-6b83-45fd-87b9-9a0302c765e9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.844035 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh" (OuterVolumeSpecName: "kube-api-access-rqbzh") pod "c1d51d09-6b83-45fd-87b9-9a0302c765e9" (UID: "c1d51d09-6b83-45fd-87b9-9a0302c765e9"). InnerVolumeSpecName "kube-api-access-rqbzh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:15 crc kubenswrapper[4764]: I0320 15:10:15.932252 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqbzh\" (UniqueName: \"kubernetes.io/projected/c1d51d09-6b83-45fd-87b9-9a0302c765e9-kube-api-access-rqbzh\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.158832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" event={"ID":"c1d51d09-6b83-45fd-87b9-9a0302c765e9","Type":"ContainerDied","Data":"b0bfd10640cf913c2fd3921a73589dfb1fdb34108ff44605f0d19b4bedffb659"} Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.159180 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jrnjm" Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.197071 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.222145 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.231066 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jrnjm"] Mar 20 15:10:16 crc kubenswrapper[4764]: W0320 15:10:16.376314 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b1a9bbb_2574_4580_bad4_631e401e074f.slice/crio-757d392ec21ecd8a74e3ccbe4f5aa66696cf0488eaff1108722598a6a07607b8 WatchSource:0}: Error finding container 757d392ec21ecd8a74e3ccbe4f5aa66696cf0488eaff1108722598a6a07607b8: Status 404 returned error can't find the container with id 757d392ec21ecd8a74e3ccbe4f5aa66696cf0488eaff1108722598a6a07607b8 Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.675704 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/kube-state-metrics-0" Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.859986 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xlzb7"] Mar 20 15:10:16 crc kubenswrapper[4764]: W0320 15:10:16.870283 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a5cf141_3541_4806_83e8_3338f7c2865c.slice/crio-73d86dfe6cd7bb8ee26355db67eb49322563008caefef1455b9650d8c485005b WatchSource:0}: Error finding container 73d86dfe6cd7bb8ee26355db67eb49322563008caefef1455b9650d8c485005b: Status 404 returned error can't find the container with id 73d86dfe6cd7bb8ee26355db67eb49322563008caefef1455b9650d8c485005b Mar 20 15:10:16 crc kubenswrapper[4764]: I0320 15:10:16.902771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.148763 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d51d09-6b83-45fd-87b9-9a0302c765e9" path="/var/lib/kubelet/pods/c1d51d09-6b83-45fd-87b9-9a0302c765e9/volumes" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.150173 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce83c633-aed5-474f-abe9-3e17b6856b5e" path="/var/lib/kubelet/pods/ce83c633-aed5-474f-abe9-3e17b6856b5e/volumes" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.172459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4501c31c-d4db-4881-b791-c4a004cab3d2","Type":"ContainerStarted","Data":"f4a5941b69fb285a062bf130b8c54a9f12b02fd7748f4c941b533143ce3a82e2"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.173940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" 
event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerStarted","Data":"7f6bd52133c8536e48f15ca9209c0e240e0343d1fa494d3a87252d6f2e7487cc"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.177370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xlzb7" event={"ID":"8a5cf141-3541-4806-83e8-3338f7c2865c","Type":"ContainerStarted","Data":"e731cc03ef36f8d530871296374e172b987cb766c87a925985c80029b9eac2fd"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.177410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xlzb7" event={"ID":"8a5cf141-3541-4806-83e8-3338f7c2865c","Type":"ContainerStarted","Data":"73d86dfe6cd7bb8ee26355db67eb49322563008caefef1455b9650d8c485005b"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.179661 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"70c53684-1a46-497b-8b8d-ab4e90fbe6c2","Type":"ContainerStarted","Data":"4a156a46243863200bae99039ddae390398db653c2cfe449c937a3d453e079b1"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.181199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" event={"ID":"7b1a9bbb-2574-4580-bad4-631e401e074f","Type":"ContainerStarted","Data":"757d392ec21ecd8a74e3ccbe4f5aa66696cf0488eaff1108722598a6a07607b8"} Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.200349 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.044526692 podStartE2EDuration="29.200326503s" podCreationTimestamp="2026-03-20 15:09:48 +0000 UTC" firstStartedPulling="2026-03-20 15:10:03.788019216 +0000 UTC m=+1125.404208345" lastFinishedPulling="2026-03-20 15:10:16.943819027 +0000 UTC m=+1138.560008156" observedRunningTime="2026-03-20 15:10:17.189677402 +0000 UTC m=+1138.805866531" watchObservedRunningTime="2026-03-20 15:10:17.200326503 +0000 UTC 
m=+1138.816515632" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.212632 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xlzb7" podStartSLOduration=4.212615405 podStartE2EDuration="4.212615405s" podCreationTimestamp="2026-03-20 15:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:17.210878011 +0000 UTC m=+1138.827067140" watchObservedRunningTime="2026-03-20 15:10:17.212615405 +0000 UTC m=+1138.828804534" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.249346 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.332618012 podStartE2EDuration="25.249327556s" podCreationTimestamp="2026-03-20 15:09:52 +0000 UTC" firstStartedPulling="2026-03-20 15:10:04.819652883 +0000 UTC m=+1126.435842012" lastFinishedPulling="2026-03-20 15:10:16.736362427 +0000 UTC m=+1138.352551556" observedRunningTime="2026-03-20 15:10:17.237372295 +0000 UTC m=+1138.853561424" watchObservedRunningTime="2026-03-20 15:10:17.249327556 +0000 UTC m=+1138.865516695" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.881040 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 15:10:17 crc kubenswrapper[4764]: I0320 15:10:17.947541 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.203021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerStarted","Data":"6191a234239791e6c54e4e0eaddf3d22537fbed1c16edc4ceff26570970df997"} Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.214216 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerID="0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41" exitCode=0 Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.214319 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" event={"ID":"7b1a9bbb-2574-4580-bad4-631e401e074f","Type":"ContainerDied","Data":"0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41"} Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.216831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.231513 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.283700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 15:10:18 crc kubenswrapper[4764]: I0320 15:10:18.369707 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.227540 4764 generic.go:334] "Generic (PLEG): container finished" podID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerID="6191a234239791e6c54e4e0eaddf3d22537fbed1c16edc4ceff26570970df997" exitCode=0 Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.227637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerDied","Data":"6191a234239791e6c54e4e0eaddf3d22537fbed1c16edc4ceff26570970df997"} Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.230447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5","Type":"ContainerStarted","Data":"853ba1e40eef674d7ca5d25caa2ed6c88027ba1617c01c85ff4c46bff5d4b2dc"} Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.234309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" event={"ID":"7b1a9bbb-2574-4580-bad4-631e401e074f","Type":"ContainerStarted","Data":"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8"} Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.234849 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.301056 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" podStartSLOduration=5.129857115 podStartE2EDuration="6.301032191s" podCreationTimestamp="2026-03-20 15:10:13 +0000 UTC" firstStartedPulling="2026-03-20 15:10:16.38076876 +0000 UTC m=+1137.996957899" lastFinishedPulling="2026-03-20 15:10:17.551943846 +0000 UTC m=+1139.168132975" observedRunningTime="2026-03-20 15:10:19.292821076 +0000 UTC m=+1140.909010225" watchObservedRunningTime="2026-03-20 15:10:19.301032191 +0000 UTC m=+1140.917221360" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.483311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.876044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.876111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 15:10:19 crc kubenswrapper[4764]: I0320 15:10:19.910431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 
15:10:20.247643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerStarted","Data":"e498391585427d43e0214f3257a12ad85528b7e316ab0f5e2ce7e3ead20f9568"} Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.248333 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.285974 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" podStartSLOduration=6.418439553 podStartE2EDuration="7.285947317s" podCreationTimestamp="2026-03-20 15:10:13 +0000 UTC" firstStartedPulling="2026-03-20 15:10:16.915662272 +0000 UTC m=+1138.531851411" lastFinishedPulling="2026-03-20 15:10:17.783170006 +0000 UTC m=+1139.399359175" observedRunningTime="2026-03-20 15:10:20.272271021 +0000 UTC m=+1141.888460170" watchObservedRunningTime="2026-03-20 15:10:20.285947317 +0000 UTC m=+1141.902136466" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.306717 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.459883 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.461776 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.465116 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.465864 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gfqc5" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.466358 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.474041 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.474803 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.522790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.522860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-config\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.522904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 
20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.522948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87q89\" (UniqueName: \"kubernetes.io/projected/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-kube-api-access-87q89\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.523000 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.523032 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.523057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-scripts\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624076 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-config\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87q89\" (UniqueName: \"kubernetes.io/projected/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-kube-api-access-87q89\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.624987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-scripts\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.625021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " 
pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.625154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-config\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.625330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.625746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-scripts\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.629795 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.629957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.630116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.650081 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87q89\" (UniqueName: \"kubernetes.io/projected/6a34707a-c81b-4987-b7cb-59ad7b8fa2ef-kube-api-access-87q89\") pod \"ovn-northd-0\" (UID: \"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef\") " pod="openstack/ovn-northd-0" Mar 20 15:10:20 crc kubenswrapper[4764]: I0320 15:10:20.780791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 15:10:21 crc kubenswrapper[4764]: I0320 15:10:21.293588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 15:10:21 crc kubenswrapper[4764]: W0320 15:10:21.302850 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a34707a_c81b_4987_b7cb_59ad7b8fa2ef.slice/crio-e6b87e4329e48d002770bfe7dfe93894cc4f01e883bb8b29c96acefc0b397ee4 WatchSource:0}: Error finding container e6b87e4329e48d002770bfe7dfe93894cc4f01e883bb8b29c96acefc0b397ee4: Status 404 returned error can't find the container with id e6b87e4329e48d002770bfe7dfe93894cc4f01e883bb8b29c96acefc0b397ee4 Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.266012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef","Type":"ContainerStarted","Data":"e6b87e4329e48d002770bfe7dfe93894cc4f01e883bb8b29c96acefc0b397ee4"} Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.882555 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-l9l9w"] Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.884067 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.892733 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.898972 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l9l9w"] Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.966703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts\") pod \"root-account-create-update-l9l9w\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:22 crc kubenswrapper[4764]: I0320 15:10:22.966877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4glv6\" (UniqueName: \"kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6\") pod \"root-account-create-update-l9l9w\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.069663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4glv6\" (UniqueName: \"kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6\") pod \"root-account-create-update-l9l9w\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.075551 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts\") pod \"root-account-create-update-l9l9w\" (UID: 
\"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.076511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts\") pod \"root-account-create-update-l9l9w\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.104032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4glv6\" (UniqueName: \"kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6\") pod \"root-account-create-update-l9l9w\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.258134 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.568141 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-l9l9w"] Mar 20 15:10:23 crc kubenswrapper[4764]: W0320 15:10:23.568692 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b455533_d0f1_4672_b0c2_4a773e2caeb1.slice/crio-ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975 WatchSource:0}: Error finding container ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975: Status 404 returned error can't find the container with id ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975 Mar 20 15:10:23 crc kubenswrapper[4764]: I0320 15:10:23.918653 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.099968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.157193 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.281775 4764 generic.go:334] "Generic (PLEG): container finished" podID="02550cd6-b0c3-4f74-a6d2-c9348fc00cc5" containerID="853ba1e40eef674d7ca5d25caa2ed6c88027ba1617c01c85ff4c46bff5d4b2dc" exitCode=0 Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.281839 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5","Type":"ContainerDied","Data":"853ba1e40eef674d7ca5d25caa2ed6c88027ba1617c01c85ff4c46bff5d4b2dc"} Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.284997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef","Type":"ContainerStarted","Data":"e390287ebc8432a5dffd564f790d4d5beccb33fb0c8ba42b161296e48d94ed9d"} Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.285057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6a34707a-c81b-4987-b7cb-59ad7b8fa2ef","Type":"ContainerStarted","Data":"fd6c5c6ab2d86f83932d69c989a1d40473d3dbcc3425564a9abe01425661ad94"} Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.285109 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.288910 4764 generic.go:334] "Generic (PLEG): container finished" podID="8b455533-d0f1-4672-b0c2-4a773e2caeb1" containerID="f830207a2cea7607a6a7a4beb77d97480a3736866a04fabfe3c4c06b2d5ff701" exitCode=0 Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.288957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l9l9w" event={"ID":"8b455533-d0f1-4672-b0c2-4a773e2caeb1","Type":"ContainerDied","Data":"f830207a2cea7607a6a7a4beb77d97480a3736866a04fabfe3c4c06b2d5ff701"} Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.289119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l9l9w" event={"ID":"8b455533-d0f1-4672-b0c2-4a773e2caeb1","Type":"ContainerStarted","Data":"ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975"} Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.289340 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="dnsmasq-dns" containerID="cri-o://8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8" gracePeriod=10 Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.335592 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.60003636 podStartE2EDuration="4.335574884s" podCreationTimestamp="2026-03-20 15:10:20 +0000 UTC" firstStartedPulling="2026-03-20 15:10:21.305667893 +0000 UTC m=+1142.921857032" lastFinishedPulling="2026-03-20 15:10:23.041206427 +0000 UTC m=+1144.657395556" observedRunningTime="2026-03-20 15:10:24.33225281 +0000 UTC m=+1145.948441939" watchObservedRunningTime="2026-03-20 15:10:24.335574884 +0000 UTC m=+1145.951764013" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.611020 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.707712 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb\") pod \"7b1a9bbb-2574-4580-bad4-631e401e074f\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.708012 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc\") pod \"7b1a9bbb-2574-4580-bad4-631e401e074f\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.708083 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ggpf\" (UniqueName: \"kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf\") pod \"7b1a9bbb-2574-4580-bad4-631e401e074f\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.708190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config\") 
pod \"7b1a9bbb-2574-4580-bad4-631e401e074f\" (UID: \"7b1a9bbb-2574-4580-bad4-631e401e074f\") " Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.712230 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf" (OuterVolumeSpecName: "kube-api-access-8ggpf") pod "7b1a9bbb-2574-4580-bad4-631e401e074f" (UID: "7b1a9bbb-2574-4580-bad4-631e401e074f"). InnerVolumeSpecName "kube-api-access-8ggpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.742055 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config" (OuterVolumeSpecName: "config") pod "7b1a9bbb-2574-4580-bad4-631e401e074f" (UID: "7b1a9bbb-2574-4580-bad4-631e401e074f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.742072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b1a9bbb-2574-4580-bad4-631e401e074f" (UID: "7b1a9bbb-2574-4580-bad4-631e401e074f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.748945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b1a9bbb-2574-4580-bad4-631e401e074f" (UID: "7b1a9bbb-2574-4580-bad4-631e401e074f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.810034 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.810078 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.810094 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ggpf\" (UniqueName: \"kubernetes.io/projected/7b1a9bbb-2574-4580-bad4-631e401e074f-kube-api-access-8ggpf\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:24 crc kubenswrapper[4764]: I0320 15:10:24.810107 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b1a9bbb-2574-4580-bad4-631e401e074f-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.310905 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerID="8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8" exitCode=0 Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.311026 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.311012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" event={"ID":"7b1a9bbb-2574-4580-bad4-631e401e074f","Type":"ContainerDied","Data":"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8"} Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.311449 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-ng5fq" event={"ID":"7b1a9bbb-2574-4580-bad4-631e401e074f","Type":"ContainerDied","Data":"757d392ec21ecd8a74e3ccbe4f5aa66696cf0488eaff1108722598a6a07607b8"} Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.311492 4764 scope.go:117] "RemoveContainer" containerID="8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.314634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"02550cd6-b0c3-4f74-a6d2-c9348fc00cc5","Type":"ContainerStarted","Data":"aca30365ab307fa0bb26b20f9bd27bc093ab7f2c5f612a42d5bdc30acf6e352b"} Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.338715 4764 scope.go:117] "RemoveContainer" containerID="0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.346337 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.353680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-ng5fq"] Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.367424 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371992.48737 podStartE2EDuration="44.367405668s" podCreationTimestamp="2026-03-20 15:09:41 +0000 UTC" 
firstStartedPulling="2026-03-20 15:09:46.84735558 +0000 UTC m=+1108.463544729" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:25.360057429 +0000 UTC m=+1146.976246588" watchObservedRunningTime="2026-03-20 15:10:25.367405668 +0000 UTC m=+1146.983594807" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.377283 4764 scope.go:117] "RemoveContainer" containerID="8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8" Mar 20 15:10:25 crc kubenswrapper[4764]: E0320 15:10:25.377846 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8\": container with ID starting with 8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8 not found: ID does not exist" containerID="8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.377911 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8"} err="failed to get container status \"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8\": rpc error: code = NotFound desc = could not find container \"8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8\": container with ID starting with 8b9b9e4936bd9493c54a3bbe7b86dd10d465f3fed512270fa40a7a9b2955add8 not found: ID does not exist" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.377951 4764 scope.go:117] "RemoveContainer" containerID="0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41" Mar 20 15:10:25 crc kubenswrapper[4764]: E0320 15:10:25.378358 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41\": container with ID starting 
with 0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41 not found: ID does not exist" containerID="0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.378441 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41"} err="failed to get container status \"0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41\": rpc error: code = NotFound desc = could not find container \"0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41\": container with ID starting with 0e13b9b2d81ac06521c60f69ef4f304605c1f7f78d6f88abc68cd82038773b41 not found: ID does not exist" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.671283 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.724910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4glv6\" (UniqueName: \"kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6\") pod \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.725052 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts\") pod \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\" (UID: \"8b455533-d0f1-4672-b0c2-4a773e2caeb1\") " Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.725887 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b455533-d0f1-4672-b0c2-4a773e2caeb1" 
(UID: "8b455533-d0f1-4672-b0c2-4a773e2caeb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.731463 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6" (OuterVolumeSpecName: "kube-api-access-4glv6") pod "8b455533-d0f1-4672-b0c2-4a773e2caeb1" (UID: "8b455533-d0f1-4672-b0c2-4a773e2caeb1"). InnerVolumeSpecName "kube-api-access-4glv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.826721 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b455533-d0f1-4672-b0c2-4a773e2caeb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:25 crc kubenswrapper[4764]: I0320 15:10:25.826770 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4glv6\" (UniqueName: \"kubernetes.io/projected/8b455533-d0f1-4672-b0c2-4a773e2caeb1-kube-api-access-4glv6\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.328265 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-l9l9w" event={"ID":"8b455533-d0f1-4672-b0c2-4a773e2caeb1","Type":"ContainerDied","Data":"ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975"} Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.328317 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba441075566c2005d92bfe772d5829d5538bdc4415de146ba219c5ee6a787975" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.328336 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-l9l9w" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.763808 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:10:26 crc kubenswrapper[4764]: E0320 15:10:26.764522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b455533-d0f1-4672-b0c2-4a773e2caeb1" containerName="mariadb-account-create-update" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.764543 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b455533-d0f1-4672-b0c2-4a773e2caeb1" containerName="mariadb-account-create-update" Mar 20 15:10:26 crc kubenswrapper[4764]: E0320 15:10:26.764586 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="init" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.764594 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="init" Mar 20 15:10:26 crc kubenswrapper[4764]: E0320 15:10:26.764620 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="dnsmasq-dns" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.764628 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="dnsmasq-dns" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.764834 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b455533-d0f1-4672-b0c2-4a773e2caeb1" containerName="mariadb-account-create-update" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.764864 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" containerName="dnsmasq-dns" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.765901 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.774261 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.843359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjlzj\" (UniqueName: \"kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.843427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.843487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.843683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.843735 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.944951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.944995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.945064 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjlzj\" (UniqueName: \"kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.945081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.945137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.945997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.946560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.947476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.947805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:26 crc kubenswrapper[4764]: I0320 15:10:26.963734 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjlzj\" (UniqueName: 
\"kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj\") pod \"dnsmasq-dns-698758b865-2jc7b\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.090268 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.139007 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1a9bbb-2574-4580-bad4-631e401e074f" path="/var/lib/kubelet/pods/7b1a9bbb-2574-4580-bad4-631e401e074f/volumes" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.381426 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.877784 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.884792 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.887313 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.888217 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.888538 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.891483 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-stg9r" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.919358 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.970419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.970459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-cache\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.970624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 
20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.970922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.971011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwm4p\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-kube-api-access-hwm4p\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:27 crc kubenswrapper[4764]: I0320 15:10:27.971164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-lock\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.072987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-lock\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.073342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.073364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-cache\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.073517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.073582 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.073609 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.073674 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:28.573655454 +0000 UTC m=+1150.189844583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.073738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-lock\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.074004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.074103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-cache\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.074143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwm4p\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-kube-api-access-hwm4p\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.074308 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.080063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.098456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwm4p\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-kube-api-access-hwm4p\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.104548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.350913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerStarted","Data":"c1bdff07276dc312b4ea36ad411d5b6db0e534267f9da853550ca49e6e0cd673"} Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.350973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerStarted","Data":"63952f0baeb758d9fbdbdea744b6c891b5afbd8efa3ca1712d309ca30640511c"} Mar 20 15:10:28 crc kubenswrapper[4764]: I0320 15:10:28.583803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: 
\"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.583993 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.584025 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:28 crc kubenswrapper[4764]: E0320 15:10:28.584089 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:29.584070365 +0000 UTC m=+1151.200259494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:29 crc kubenswrapper[4764]: I0320 15:10:29.359650 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerID="c1bdff07276dc312b4ea36ad411d5b6db0e534267f9da853550ca49e6e0cd673" exitCode=0 Mar 20 15:10:29 crc kubenswrapper[4764]: I0320 15:10:29.360011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerDied","Data":"c1bdff07276dc312b4ea36ad411d5b6db0e534267f9da853550ca49e6e0cd673"} Mar 20 15:10:29 crc kubenswrapper[4764]: I0320 15:10:29.601234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " 
pod="openstack/swift-storage-0" Mar 20 15:10:29 crc kubenswrapper[4764]: E0320 15:10:29.601451 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:29 crc kubenswrapper[4764]: E0320 15:10:29.601487 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:29 crc kubenswrapper[4764]: E0320 15:10:29.601544 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:31.601526921 +0000 UTC m=+1153.217716050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:30 crc kubenswrapper[4764]: I0320 15:10:30.372600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerStarted","Data":"b5f0eb3eac9afc6384858b2bbbbaf2ec68d8c0a38d04117b016885c0de25ad70"} Mar 20 15:10:30 crc kubenswrapper[4764]: I0320 15:10:30.373360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:30 crc kubenswrapper[4764]: I0320 15:10:30.403737 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2jc7b" podStartSLOduration=4.403713735 podStartE2EDuration="4.403713735s" podCreationTimestamp="2026-03-20 15:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:30.400585947 +0000 
UTC m=+1152.016775116" watchObservedRunningTime="2026-03-20 15:10:30.403713735 +0000 UTC m=+1152.019902884" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.637435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:31 crc kubenswrapper[4764]: E0320 15:10:31.637692 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:31 crc kubenswrapper[4764]: E0320 15:10:31.637734 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:31 crc kubenswrapper[4764]: E0320 15:10:31.637807 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:35.637783866 +0000 UTC m=+1157.253972995 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.844042 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fqbvt"] Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.845538 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.848304 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.848310 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.848564 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.861101 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fqbvt"] Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.942882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.943473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.943575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.943724 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.943860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.943996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:31 crc kubenswrapper[4764]: I0320 15:10:31.944102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2th5\" (UniqueName: \"kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046406 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.046960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.047071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.047238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.047350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2th5\" (UniqueName: \"kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.047886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.052413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.052528 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf\") pod 
\"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.053030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.064955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2th5\" (UniqueName: \"kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5\") pod \"swift-ring-rebalance-fqbvt\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.166366 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.596640 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fqbvt"] Mar 20 15:10:32 crc kubenswrapper[4764]: W0320 15:10:32.598595 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14bf0f11_9be0_4cd6_9395_a9c2d4e12706.slice/crio-4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b WatchSource:0}: Error finding container 4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b: Status 404 returned error can't find the container with id 4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.921989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 15:10:32 crc kubenswrapper[4764]: I0320 15:10:32.922164 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 15:10:33 crc kubenswrapper[4764]: I0320 15:10:33.002450 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 15:10:33 crc kubenswrapper[4764]: I0320 15:10:33.399420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fqbvt" event={"ID":"14bf0f11-9be0-4cd6-9395-a9c2d4e12706","Type":"ContainerStarted","Data":"4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b"} Mar 20 15:10:33 crc kubenswrapper[4764]: I0320 15:10:33.503538 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.411355 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-74vjq"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.416042 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.429701 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-74vjq"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.521254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.521516 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vdjh\" (UniqueName: \"kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.537050 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-44ab-account-create-update-mkdlb"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.547291 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.550233 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.551971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-44ab-account-create-update-mkdlb"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.625563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.625668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vdjh\" (UniqueName: \"kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.625762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7vl\" (UniqueName: \"kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.625785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: 
\"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.627207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.629222 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7pbv8"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.631683 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.650022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vdjh\" (UniqueName: \"kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh\") pod \"keystone-db-create-74vjq\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.654325 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7pbv8"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.727651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.727761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7vl\" (UniqueName: 
\"kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.727802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.727844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.727937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4r7\" (UniqueName: \"kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: E0320 15:10:35.728476 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:35 crc kubenswrapper[4764]: E0320 15:10:35.728560 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:35 crc kubenswrapper[4764]: E0320 15:10:35.728730 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:43.728658097 +0000 UTC m=+1165.344847266 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.728919 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.733210 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c907-account-create-update-flnnn"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.734542 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.737369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.744752 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c907-account-create-update-flnnn"] Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.753976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7vl\" (UniqueName: \"kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl\") pod \"keystone-44ab-account-create-update-mkdlb\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.755724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.830855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.831201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4r7\" (UniqueName: \"kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.831238 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqfv\" (UniqueName: \"kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.831341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.832185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.858441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4r7\" (UniqueName: \"kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7\") pod \"placement-db-create-7pbv8\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.880004 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.932020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.932110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqfv\" (UniqueName: \"kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.933464 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:35 crc kubenswrapper[4764]: I0320 15:10:35.952412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqfv\" (UniqueName: \"kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv\") pod \"placement-c907-account-create-update-flnnn\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.014673 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.050559 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.207671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-74vjq"] Mar 20 15:10:36 crc kubenswrapper[4764]: W0320 15:10:36.213504 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ac66a3_3f0d_40c9_94d4_01d3c8d91de2.slice/crio-d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2 WatchSource:0}: Error finding container d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2: Status 404 returned error can't find the container with id d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2 Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.316114 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-44ab-account-create-update-mkdlb"] Mar 20 15:10:36 crc kubenswrapper[4764]: W0320 15:10:36.328695 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod178a6de3_2ce6_43d1_951b_e7b3dc5c4cbe.slice/crio-acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419 WatchSource:0}: Error finding container acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419: Status 404 returned error can't find the container with id acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419 Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.454430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-74vjq" event={"ID":"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2","Type":"ContainerStarted","Data":"a225621c76af64bcf5e5bb68f7f64c06c3b503f05535f0b2ef9e6f84c5f2b116"} Mar 
20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.454512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-74vjq" event={"ID":"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2","Type":"ContainerStarted","Data":"d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2"} Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.460736 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7pbv8"] Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.461128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-44ab-account-create-update-mkdlb" event={"ID":"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe","Type":"ContainerStarted","Data":"acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419"} Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.463683 4764 generic.go:334] "Generic (PLEG): container finished" podID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerID="78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08" exitCode=0 Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.463748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerDied","Data":"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08"} Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.466742 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerID="9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651" exitCode=0 Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.466806 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerDied","Data":"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651"} Mar 20 15:10:36 crc kubenswrapper[4764]: W0320 15:10:36.471435 4764 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaec8fd1c_0dd8_48d3_8194_5d74198c652e.slice/crio-2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df WatchSource:0}: Error finding container 2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df: Status 404 returned error can't find the container with id 2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.478472 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-74vjq" podStartSLOduration=1.478361587 podStartE2EDuration="1.478361587s" podCreationTimestamp="2026-03-20 15:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:36.471514704 +0000 UTC m=+1158.087703833" watchObservedRunningTime="2026-03-20 15:10:36.478361587 +0000 UTC m=+1158.094550716" Mar 20 15:10:36 crc kubenswrapper[4764]: I0320 15:10:36.543022 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c907-account-create-update-flnnn"] Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.091572 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.174703 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.174900 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="dnsmasq-dns" containerID="cri-o://e498391585427d43e0214f3257a12ad85528b7e316ab0f5e2ce7e3ead20f9568" gracePeriod=10 Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.485396 4764 
generic.go:334] "Generic (PLEG): container finished" podID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerID="e498391585427d43e0214f3257a12ad85528b7e316ab0f5e2ce7e3ead20f9568" exitCode=0 Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.485462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerDied","Data":"e498391585427d43e0214f3257a12ad85528b7e316ab0f5e2ce7e3ead20f9568"} Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.487889 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" containerID="a225621c76af64bcf5e5bb68f7f64c06c3b503f05535f0b2ef9e6f84c5f2b116" exitCode=0 Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.487934 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-74vjq" event={"ID":"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2","Type":"ContainerDied","Data":"a225621c76af64bcf5e5bb68f7f64c06c3b503f05535f0b2ef9e6f84c5f2b116"} Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.492155 4764 generic.go:334] "Generic (PLEG): container finished" podID="178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" containerID="b00c8d0ce5992a941a7b4a0bb4646136131f15c9829801c3d4f7d8d5a66a62d6" exitCode=0 Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.492203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-44ab-account-create-update-mkdlb" event={"ID":"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe","Type":"ContainerDied","Data":"b00c8d0ce5992a941a7b4a0bb4646136131f15c9829801c3d4f7d8d5a66a62d6"} Mar 20 15:10:37 crc kubenswrapper[4764]: I0320 15:10:37.493163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pbv8" event={"ID":"aec8fd1c-0dd8-48d3-8194-5d74198c652e","Type":"ContainerStarted","Data":"2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df"} Mar 20 15:10:38 crc kubenswrapper[4764]: 
I0320 15:10:38.443927 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:10:38 crc kubenswrapper[4764]: I0320 15:10:38.444289 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.094586 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.704534 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-tnk24"] Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.705844 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.719510 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tnk24"] Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.720615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc296\" (UniqueName: \"kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.721072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.802532 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e671-account-create-update-np4kd"] Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.803814 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.806652 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.814560 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e671-account-create-update-np4kd"] Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.826778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc296\" (UniqueName: \"kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.826925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.828328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.848305 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc296\" (UniqueName: \"kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296\") pod \"glance-db-create-tnk24\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " pod="openstack/glance-db-create-tnk24" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 
15:10:39.932792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlwg\" (UniqueName: \"kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:39 crc kubenswrapper[4764]: I0320 15:10:39.932862 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:39 crc kubenswrapper[4764]: W0320 15:10:39.941214 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4206356_ff23_4e94_b1bb_d27749ca895d.slice/crio-8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e WatchSource:0}: Error finding container 8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e: Status 404 returned error can't find the container with id 8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.034829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlwg\" (UniqueName: \"kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.035071 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tnk24" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.035106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.035834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.059601 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.067173 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlwg\" (UniqueName: \"kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg\") pod \"glance-e671-account-create-update-np4kd\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.135515 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.137480 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7vl\" (UniqueName: \"kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl\") pod \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.137625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts\") pod \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\" (UID: \"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.139011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" (UID: "178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.141050 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.141162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl" (OuterVolumeSpecName: "kube-api-access-bf7vl") pod "178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" (UID: "178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe"). InnerVolumeSpecName "kube-api-access-bf7vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.239735 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts\") pod \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.240066 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vdjh\" (UniqueName: \"kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh\") pod \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\" (UID: \"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.240070 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" (UID: "f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.240504 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.240519 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7vl\" (UniqueName: \"kubernetes.io/projected/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe-kube-api-access-bf7vl\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.240528 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.242553 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ncp4w" podUID="cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d" containerName="ovn-controller" probeResult="failure" output=< Mar 20 15:10:40 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 15:10:40 crc kubenswrapper[4764]: > Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.254209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh" (OuterVolumeSpecName: "kube-api-access-8vdjh") pod "f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" (UID: "f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2"). InnerVolumeSpecName "kube-api-access-8vdjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.279505 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.343778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb\") pod \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.343852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config\") pod \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.343874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb\") pod \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.343982 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc\") pod \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.344035 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nr9s\" (UniqueName: \"kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s\") pod \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\" (UID: \"fb2661b6-35bf-4ad6-ad3d-b84de19786fb\") " Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.344402 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vdjh\" (UniqueName: 
\"kubernetes.io/projected/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2-kube-api-access-8vdjh\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.353118 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s" (OuterVolumeSpecName: "kube-api-access-6nr9s") pod "fb2661b6-35bf-4ad6-ad3d-b84de19786fb" (UID: "fb2661b6-35bf-4ad6-ad3d-b84de19786fb"). InnerVolumeSpecName "kube-api-access-6nr9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.385092 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb2661b6-35bf-4ad6-ad3d-b84de19786fb" (UID: "fb2661b6-35bf-4ad6-ad3d-b84de19786fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.385105 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb2661b6-35bf-4ad6-ad3d-b84de19786fb" (UID: "fb2661b6-35bf-4ad6-ad3d-b84de19786fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.387202 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb2661b6-35bf-4ad6-ad3d-b84de19786fb" (UID: "fb2661b6-35bf-4ad6-ad3d-b84de19786fb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.393316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config" (OuterVolumeSpecName: "config") pod "fb2661b6-35bf-4ad6-ad3d-b84de19786fb" (UID: "fb2661b6-35bf-4ad6-ad3d-b84de19786fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.446204 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.446232 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.446240 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.446249 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.446258 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nr9s\" (UniqueName: \"kubernetes.io/projected/fb2661b6-35bf-4ad6-ad3d-b84de19786fb-kube-api-access-6nr9s\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.522294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pbv8" 
event={"ID":"aec8fd1c-0dd8-48d3-8194-5d74198c652e","Type":"ContainerStarted","Data":"0bae5de10238a7beb8f0a7f3802a329a4df5969222160262a576adb89cbbfcdf"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.533757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerStarted","Data":"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.534032 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.549439 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-tnk24"] Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.550038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c907-account-create-update-flnnn" event={"ID":"f4206356-ff23-4e94-b1bb-d27749ca895d","Type":"ContainerStarted","Data":"9248544ce2256e2693b971c6047c97c5fc06d849b130ad0e5fc7083448a0d67c"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.550079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c907-account-create-update-flnnn" event={"ID":"f4206356-ff23-4e94-b1bb-d27749ca895d","Type":"ContainerStarted","Data":"8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.557297 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-7pbv8" podStartSLOduration=5.557262455 podStartE2EDuration="5.557262455s" podCreationTimestamp="2026-03-20 15:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:40.541816884 +0000 UTC m=+1162.158006013" watchObservedRunningTime="2026-03-20 15:10:40.557262455 +0000 UTC 
m=+1162.173451584" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.559587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerStarted","Data":"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.559865 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.561371 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" event={"ID":"fb2661b6-35bf-4ad6-ad3d-b84de19786fb","Type":"ContainerDied","Data":"7f6bd52133c8536e48f15ca9209c0e240e0343d1fa494d3a87252d6f2e7487cc"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.561422 4764 scope.go:117] "RemoveContainer" containerID="e498391585427d43e0214f3257a12ad85528b7e316ab0f5e2ce7e3ead20f9568" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.561520 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-77jwd" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.564794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fqbvt" event={"ID":"14bf0f11-9be0-4cd6-9395-a9c2d4e12706","Type":"ContainerStarted","Data":"b2cb3dae89d9b4b16d2e829aa0fe421ead5c722733f85d94448cc7255790f662"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.570854 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.874942908 podStartE2EDuration="1m0.570839177s" podCreationTimestamp="2026-03-20 15:09:40 +0000 UTC" firstStartedPulling="2026-03-20 15:09:42.010147084 +0000 UTC m=+1103.626336203" lastFinishedPulling="2026-03-20 15:10:02.706043353 +0000 UTC m=+1124.322232472" observedRunningTime="2026-03-20 15:10:40.564484439 +0000 UTC m=+1162.180673568" watchObservedRunningTime="2026-03-20 15:10:40.570839177 +0000 UTC m=+1162.187028306" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.571164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-74vjq" event={"ID":"f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2","Type":"ContainerDied","Data":"d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.571203 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57d5560ac1d70f58f79158fa8a97a4ee1daec19c5bbf87c434457f461602af2" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.571177 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-74vjq" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.572363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-44ab-account-create-update-mkdlb" event={"ID":"178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe","Type":"ContainerDied","Data":"acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419"} Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.572408 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf81a685ec208d467cf3482a4aacd7bcbfdb9f43b6ebdb99e9adeb187ca9419" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.572448 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-44ab-account-create-update-mkdlb" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.587948 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c907-account-create-update-flnnn" podStartSLOduration=5.587932449 podStartE2EDuration="5.587932449s" podCreationTimestamp="2026-03-20 15:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:40.585172793 +0000 UTC m=+1162.201361922" watchObservedRunningTime="2026-03-20 15:10:40.587932449 +0000 UTC m=+1162.204121578" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.613314 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.009245754 podStartE2EDuration="1m0.613293377s" podCreationTimestamp="2026-03-20 15:09:40 +0000 UTC" firstStartedPulling="2026-03-20 15:09:42.262503011 +0000 UTC m=+1103.878692140" lastFinishedPulling="2026-03-20 15:10:02.866550624 +0000 UTC m=+1124.482739763" observedRunningTime="2026-03-20 15:10:40.603232984 +0000 UTC m=+1162.219422113" watchObservedRunningTime="2026-03-20 15:10:40.613293377 +0000 UTC 
m=+1162.229482506" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.622236 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-fqbvt" podStartSLOduration=2.168242994 podStartE2EDuration="9.622222375s" podCreationTimestamp="2026-03-20 15:10:31 +0000 UTC" firstStartedPulling="2026-03-20 15:10:32.601568844 +0000 UTC m=+1154.217757983" lastFinishedPulling="2026-03-20 15:10:40.055548215 +0000 UTC m=+1161.671737364" observedRunningTime="2026-03-20 15:10:40.619867641 +0000 UTC m=+1162.236056770" watchObservedRunningTime="2026-03-20 15:10:40.622222375 +0000 UTC m=+1162.238411504" Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.644375 4764 scope.go:117] "RemoveContainer" containerID="6191a234239791e6c54e4e0eaddf3d22537fbed1c16edc4ceff26570970df997" Mar 20 15:10:40 crc kubenswrapper[4764]: W0320 15:10:40.651673 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd636f2ec_b9c4_4d44_be52_c1c6a570bf4c.slice/crio-9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3 WatchSource:0}: Error finding container 9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3: Status 404 returned error can't find the container with id 9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3 Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.677532 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.684738 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-77jwd"] Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.705059 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e671-account-create-update-np4kd"] Mar 20 15:10:40 crc kubenswrapper[4764]: W0320 15:10:40.710541 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fb0f4b5_4acb_421d_9df5_ed34eed17848.slice/crio-c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360 WatchSource:0}: Error finding container c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360: Status 404 returned error can't find the container with id c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360 Mar 20 15:10:40 crc kubenswrapper[4764]: I0320 15:10:40.846519 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.162268 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" path="/var/lib/kubelet/pods/fb2661b6-35bf-4ad6-ad3d-b84de19786fb/volumes" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.551707 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-l9l9w"] Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.559220 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-l9l9w"] Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.581902 4764 generic.go:334] "Generic (PLEG): container finished" podID="d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" containerID="507773a7464ac1ca2752ad7b1657c72617f1df223657774bcac59a0229b47717" exitCode=0 Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.581964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tnk24" event={"ID":"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c","Type":"ContainerDied","Data":"507773a7464ac1ca2752ad7b1657c72617f1df223657774bcac59a0229b47717"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.582429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tnk24" 
event={"ID":"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c","Type":"ContainerStarted","Data":"9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.586505 4764 generic.go:334] "Generic (PLEG): container finished" podID="aec8fd1c-0dd8-48d3-8194-5d74198c652e" containerID="0bae5de10238a7beb8f0a7f3802a329a4df5969222160262a576adb89cbbfcdf" exitCode=0 Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.586568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pbv8" event={"ID":"aec8fd1c-0dd8-48d3-8194-5d74198c652e","Type":"ContainerDied","Data":"0bae5de10238a7beb8f0a7f3802a329a4df5969222160262a576adb89cbbfcdf"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.588304 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4206356-ff23-4e94-b1bb-d27749ca895d" containerID="9248544ce2256e2693b971c6047c97c5fc06d849b130ad0e5fc7083448a0d67c" exitCode=0 Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.588346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c907-account-create-update-flnnn" event={"ID":"f4206356-ff23-4e94-b1bb-d27749ca895d","Type":"ContainerDied","Data":"9248544ce2256e2693b971c6047c97c5fc06d849b130ad0e5fc7083448a0d67c"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.589835 4764 generic.go:334] "Generic (PLEG): container finished" podID="7fb0f4b5-4acb-421d-9df5-ed34eed17848" containerID="87866399c513616618074196708c942f4b19b7989a488296c5ac172d763c9e02" exitCode=0 Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.589904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e671-account-create-update-np4kd" event={"ID":"7fb0f4b5-4acb-421d-9df5-ed34eed17848","Type":"ContainerDied","Data":"87866399c513616618074196708c942f4b19b7989a488296c5ac172d763c9e02"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.590260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-e671-account-create-update-np4kd" event={"ID":"7fb0f4b5-4acb-421d-9df5-ed34eed17848","Type":"ContainerStarted","Data":"c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360"} Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.643602 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9drjm"] Mar 20 15:10:41 crc kubenswrapper[4764]: E0320 15:10:41.643958 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" containerName="mariadb-account-create-update" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.643973 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" containerName="mariadb-account-create-update" Mar 20 15:10:41 crc kubenswrapper[4764]: E0320 15:10:41.643992 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="dnsmasq-dns" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.643998 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="dnsmasq-dns" Mar 20 15:10:41 crc kubenswrapper[4764]: E0320 15:10:41.644015 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="init" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.644021 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="init" Mar 20 15:10:41 crc kubenswrapper[4764]: E0320 15:10:41.644033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" containerName="mariadb-database-create" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.644039 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" containerName="mariadb-database-create" Mar 20 15:10:41 crc 
kubenswrapper[4764]: I0320 15:10:41.644178 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" containerName="mariadb-database-create" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.644194 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb2661b6-35bf-4ad6-ad3d-b84de19786fb" containerName="dnsmasq-dns" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.644206 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" containerName="mariadb-account-create-update" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.644712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.647135 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.678816 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9drjm"] Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.771866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.771954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhh9\" (UniqueName: \"kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 
15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.874329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.874814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhh9\" (UniqueName: \"kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.875175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.908904 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhh9\" (UniqueName: \"kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9\") pod \"root-account-create-update-9drjm\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:41 crc kubenswrapper[4764]: I0320 15:10:41.961409 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:42 crc kubenswrapper[4764]: I0320 15:10:42.405811 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9drjm"] Mar 20 15:10:42 crc kubenswrapper[4764]: I0320 15:10:42.598929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9drjm" event={"ID":"87e2b458-a224-4332-8751-c45f85743cb7","Type":"ContainerStarted","Data":"b7c01949f13ac6f2adc1c5d1353984bc20bf37f57280f04d6cecee64bc4ff0fb"} Mar 20 15:10:42 crc kubenswrapper[4764]: I0320 15:10:42.598977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9drjm" event={"ID":"87e2b458-a224-4332-8751-c45f85743cb7","Type":"ContainerStarted","Data":"ddeaacbeadeff3f9d27deacce1c943f079730143fd0df59e6e2219dc003a387d"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.021461 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.136000 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b455533-d0f1-4672-b0c2-4a773e2caeb1" path="/var/lib/kubelet/pods/8b455533-d0f1-4672-b0c2-4a773e2caeb1/volumes" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.194245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts\") pod \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\" (UID: \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.194398 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t4r7\" (UniqueName: \"kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7\") pod \"aec8fd1c-0dd8-48d3-8194-5d74198c652e\" (UID: 
\"aec8fd1c-0dd8-48d3-8194-5d74198c652e\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.194956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aec8fd1c-0dd8-48d3-8194-5d74198c652e" (UID: "aec8fd1c-0dd8-48d3-8194-5d74198c652e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.199580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7" (OuterVolumeSpecName: "kube-api-access-8t4r7") pod "aec8fd1c-0dd8-48d3-8194-5d74198c652e" (UID: "aec8fd1c-0dd8-48d3-8194-5d74198c652e"). InnerVolumeSpecName "kube-api-access-8t4r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.253697 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.257767 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.261718 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tnk24" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.296140 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aec8fd1c-0dd8-48d3-8194-5d74198c652e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.296168 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t4r7\" (UniqueName: \"kubernetes.io/projected/aec8fd1c-0dd8-48d3-8194-5d74198c652e-kube-api-access-8t4r7\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts\") pod \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397703 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts\") pod \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxlwg\" (UniqueName: \"kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg\") pod \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\" (UID: \"7fb0f4b5-4acb-421d-9df5-ed34eed17848\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397797 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc296\" (UniqueName: \"kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296\") pod 
\"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\" (UID: \"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkqfv\" (UniqueName: \"kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv\") pod \"f4206356-ff23-4e94-b1bb-d27749ca895d\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.397882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts\") pod \"f4206356-ff23-4e94-b1bb-d27749ca895d\" (UID: \"f4206356-ff23-4e94-b1bb-d27749ca895d\") " Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.398409 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fb0f4b5-4acb-421d-9df5-ed34eed17848" (UID: "7fb0f4b5-4acb-421d-9df5-ed34eed17848"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.398709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4206356-ff23-4e94-b1bb-d27749ca895d" (UID: "f4206356-ff23-4e94-b1bb-d27749ca895d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.398732 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" (UID: "d636f2ec-b9c4-4d44-be52-c1c6a570bf4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.401734 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296" (OuterVolumeSpecName: "kube-api-access-rc296") pod "d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" (UID: "d636f2ec-b9c4-4d44-be52-c1c6a570bf4c"). InnerVolumeSpecName "kube-api-access-rc296". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.401824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg" (OuterVolumeSpecName: "kube-api-access-rxlwg") pod "7fb0f4b5-4acb-421d-9df5-ed34eed17848" (UID: "7fb0f4b5-4acb-421d-9df5-ed34eed17848"). InnerVolumeSpecName "kube-api-access-rxlwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.402643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv" (OuterVolumeSpecName: "kube-api-access-wkqfv") pod "f4206356-ff23-4e94-b1bb-d27749ca895d" (UID: "f4206356-ff23-4e94-b1bb-d27749ca895d"). InnerVolumeSpecName "kube-api-access-wkqfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499406 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkqfv\" (UniqueName: \"kubernetes.io/projected/f4206356-ff23-4e94-b1bb-d27749ca895d-kube-api-access-wkqfv\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499453 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4206356-ff23-4e94-b1bb-d27749ca895d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499472 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499489 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb0f4b5-4acb-421d-9df5-ed34eed17848-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499505 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxlwg\" (UniqueName: \"kubernetes.io/projected/7fb0f4b5-4acb-421d-9df5-ed34eed17848-kube-api-access-rxlwg\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.499519 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc296\" (UniqueName: \"kubernetes.io/projected/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c-kube-api-access-rc296\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.607915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pbv8" 
event={"ID":"aec8fd1c-0dd8-48d3-8194-5d74198c652e","Type":"ContainerDied","Data":"2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.608254 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bba5da0a8508ad82efd57385299757885393cabd94bb00ca0fa284be99ae7df" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.608227 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7pbv8" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.609796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c907-account-create-update-flnnn" event={"ID":"f4206356-ff23-4e94-b1bb-d27749ca895d","Type":"ContainerDied","Data":"8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.609826 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c907-account-create-update-flnnn" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.609829 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d68e402fddd50f2aaa01d8ef928cd14a1642a1365bdd272701539339b4cfe4e" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.611905 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e671-account-create-update-np4kd" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.611942 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e671-account-create-update-np4kd" event={"ID":"7fb0f4b5-4acb-421d-9df5-ed34eed17848","Type":"ContainerDied","Data":"c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.612114 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65c5baad9f2016af10333eea30731a525d7f45ed906b465b3ad874c43f97360" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.614649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-tnk24" event={"ID":"d636f2ec-b9c4-4d44-be52-c1c6a570bf4c","Type":"ContainerDied","Data":"9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.614710 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9205eadc68e5766b0d3a5443adc83cac4092f99473a9c77a26c39ff1e064d8e3" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.614810 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-tnk24" Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.624589 4764 generic.go:334] "Generic (PLEG): container finished" podID="87e2b458-a224-4332-8751-c45f85743cb7" containerID="b7c01949f13ac6f2adc1c5d1353984bc20bf37f57280f04d6cecee64bc4ff0fb" exitCode=0 Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.624640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9drjm" event={"ID":"87e2b458-a224-4332-8751-c45f85743cb7","Type":"ContainerDied","Data":"b7c01949f13ac6f2adc1c5d1353984bc20bf37f57280f04d6cecee64bc4ff0fb"} Mar 20 15:10:43 crc kubenswrapper[4764]: I0320 15:10:43.803548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:43 crc kubenswrapper[4764]: E0320 15:10:43.803862 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 15:10:43 crc kubenswrapper[4764]: E0320 15:10:43.803930 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 15:10:43 crc kubenswrapper[4764]: E0320 15:10:43.804048 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift podName:d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0 nodeName:}" failed. No retries permitted until 2026-03-20 15:10:59.804008777 +0000 UTC m=+1181.420197946 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift") pod "swift-storage-0" (UID: "d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0") : configmap "swift-ring-files" not found Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949154 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-klrf6"] Mar 20 15:10:44 crc kubenswrapper[4764]: E0320 15:10:44.949501 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4206356-ff23-4e94-b1bb-d27749ca895d" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949538 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4206356-ff23-4e94-b1bb-d27749ca895d" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: E0320 15:10:44.949558 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949564 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: E0320 15:10:44.949585 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec8fd1c-0dd8-48d3-8194-5d74198c652e" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949591 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec8fd1c-0dd8-48d3-8194-5d74198c652e" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: E0320 15:10:44.949602 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb0f4b5-4acb-421d-9df5-ed34eed17848" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949607 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7fb0f4b5-4acb-421d-9df5-ed34eed17848" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949792 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949807 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec8fd1c-0dd8-48d3-8194-5d74198c652e" containerName="mariadb-database-create" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949818 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb0f4b5-4acb-421d-9df5-ed34eed17848" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.949833 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4206356-ff23-4e94-b1bb-d27749ca895d" containerName="mariadb-account-create-update" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.950320 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.956047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.956273 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qbgl2" Mar 20 15:10:44 crc kubenswrapper[4764]: I0320 15:10:44.962124 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-klrf6"] Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.024228 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.024588 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.024627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskm2\" (UniqueName: \"kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.024773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.066095 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.126355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.126650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.126751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskm2\" (UniqueName: \"kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.126837 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.132610 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.132933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.134298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.146580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskm2\" (UniqueName: \"kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2\") pod \"glance-db-sync-klrf6\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.212677 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ncp4w" podUID="cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d" containerName="ovn-controller" probeResult="failure" output=< Mar 20 15:10:45 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 15:10:45 crc kubenswrapper[4764]: > Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.227900 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts\") pod \"87e2b458-a224-4332-8751-c45f85743cb7\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.227992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhh9\" (UniqueName: \"kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9\") pod \"87e2b458-a224-4332-8751-c45f85743cb7\" (UID: \"87e2b458-a224-4332-8751-c45f85743cb7\") " Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.228553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87e2b458-a224-4332-8751-c45f85743cb7" (UID: "87e2b458-a224-4332-8751-c45f85743cb7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.231729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9" (OuterVolumeSpecName: "kube-api-access-dxhh9") pod "87e2b458-a224-4332-8751-c45f85743cb7" (UID: "87e2b458-a224-4332-8751-c45f85743cb7"). InnerVolumeSpecName "kube-api-access-dxhh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.236974 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.246340 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-kb2ph" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.277217 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-klrf6" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.330249 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87e2b458-a224-4332-8751-c45f85743cb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.330279 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxhh9\" (UniqueName: \"kubernetes.io/projected/87e2b458-a224-4332-8751-c45f85743cb7-kube-api-access-dxhh9\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.461389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ncp4w-config-6m4t2"] Mar 20 15:10:45 crc kubenswrapper[4764]: E0320 15:10:45.464424 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e2b458-a224-4332-8751-c45f85743cb7" containerName="mariadb-account-create-update" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.464440 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e2b458-a224-4332-8751-c45f85743cb7" containerName="mariadb-account-create-update" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.464601 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e2b458-a224-4332-8751-c45f85743cb7" containerName="mariadb-account-create-update" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.465080 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.477129 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w-config-6m4t2"] Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.482029 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.640181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8tx\" (UniqueName: \"kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.640879 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.641035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.641153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: 
\"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.641309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.641671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.645076 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9drjm" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.646564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9drjm" event={"ID":"87e2b458-a224-4332-8751-c45f85743cb7","Type":"ContainerDied","Data":"ddeaacbeadeff3f9d27deacce1c943f079730143fd0df59e6e2219dc003a387d"} Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.646603 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddeaacbeadeff3f9d27deacce1c943f079730143fd0df59e6e2219dc003a387d" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8tx\" (UniqueName: \"kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744677 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.744766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.745755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.745826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.745832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.746944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.747264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.766547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8tx\" (UniqueName: \"kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx\") pod \"ovn-controller-ncp4w-config-6m4t2\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.819943 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-klrf6"] Mar 20 15:10:45 crc kubenswrapper[4764]: I0320 15:10:45.824930 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:45 crc kubenswrapper[4764]: W0320 15:10:45.825557 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadc582c0_f416_4991_89c7_9ddb850c0f2b.slice/crio-168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6 WatchSource:0}: Error finding container 168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6: Status 404 returned error can't find the container with id 168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6 Mar 20 15:10:46 crc kubenswrapper[4764]: I0320 15:10:46.293954 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w-config-6m4t2"] Mar 20 15:10:46 crc kubenswrapper[4764]: I0320 15:10:46.656448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-klrf6" event={"ID":"adc582c0-f416-4991-89c7-9ddb850c0f2b","Type":"ContainerStarted","Data":"168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6"} Mar 20 15:10:46 crc kubenswrapper[4764]: I0320 15:10:46.658086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-6m4t2" event={"ID":"b83b1169-c297-455f-a1a1-56c9088385c7","Type":"ContainerStarted","Data":"a39624500902aecf9b1f4fab12c4f21d0c0ba21c979c01022d316a34f7c2c00d"} Mar 20 15:10:46 crc kubenswrapper[4764]: I0320 15:10:46.658116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-6m4t2" event={"ID":"b83b1169-c297-455f-a1a1-56c9088385c7","Type":"ContainerStarted","Data":"e696f12abb69d5f7a658b8298f103d8b1dd6db8cc18db762422bfcf78de64f2b"} Mar 20 15:10:46 crc kubenswrapper[4764]: I0320 15:10:46.693734 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ncp4w-config-6m4t2" podStartSLOduration=1.693709777 podStartE2EDuration="1.693709777s" 
podCreationTimestamp="2026-03-20 15:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:10:46.690388594 +0000 UTC m=+1168.306577743" watchObservedRunningTime="2026-03-20 15:10:46.693709777 +0000 UTC m=+1168.309898916" Mar 20 15:10:47 crc kubenswrapper[4764]: I0320 15:10:47.672376 4764 generic.go:334] "Generic (PLEG): container finished" podID="14bf0f11-9be0-4cd6-9395-a9c2d4e12706" containerID="b2cb3dae89d9b4b16d2e829aa0fe421ead5c722733f85d94448cc7255790f662" exitCode=0 Mar 20 15:10:47 crc kubenswrapper[4764]: I0320 15:10:47.672461 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fqbvt" event={"ID":"14bf0f11-9be0-4cd6-9395-a9c2d4e12706","Type":"ContainerDied","Data":"b2cb3dae89d9b4b16d2e829aa0fe421ead5c722733f85d94448cc7255790f662"} Mar 20 15:10:47 crc kubenswrapper[4764]: I0320 15:10:47.678103 4764 generic.go:334] "Generic (PLEG): container finished" podID="b83b1169-c297-455f-a1a1-56c9088385c7" containerID="a39624500902aecf9b1f4fab12c4f21d0c0ba21c979c01022d316a34f7c2c00d" exitCode=0 Mar 20 15:10:47 crc kubenswrapper[4764]: I0320 15:10:47.678131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-6m4t2" event={"ID":"b83b1169-c297-455f-a1a1-56c9088385c7","Type":"ContainerDied","Data":"a39624500902aecf9b1f4fab12c4f21d0c0ba21c979c01022d316a34f7c2c00d"} Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.024481 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.095475 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.202394 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.202440 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.202939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.202461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2th5\" (UniqueName: \"kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203260 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203302 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8tx\" (UniqueName: \"kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203328 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203345 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203443 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts\") pod \"b83b1169-c297-455f-a1a1-56c9088385c7\" (UID: \"b83b1169-c297-455f-a1a1-56c9088385c7\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf\") pod \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\" (UID: \"14bf0f11-9be0-4cd6-9395-a9c2d4e12706\") " Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203694 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203934 4764 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203955 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203964 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.203989 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run" (OuterVolumeSpecName: 
"var-run") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.205054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.205266 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts" (OuterVolumeSpecName: "scripts") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.206121 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.208084 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5" (OuterVolumeSpecName: "kube-api-access-b2th5") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "kube-api-access-b2th5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.208781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx" (OuterVolumeSpecName: "kube-api-access-lp8tx") pod "b83b1169-c297-455f-a1a1-56c9088385c7" (UID: "b83b1169-c297-455f-a1a1-56c9088385c7"). InnerVolumeSpecName "kube-api-access-lp8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.226062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.226257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.228831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.238283 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts" (OuterVolumeSpecName: "scripts") pod "14bf0f11-9be0-4cd6-9395-a9c2d4e12706" (UID: "14bf0f11-9be0-4cd6-9395-a9c2d4e12706"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.307913 4764 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.308529 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2th5\" (UniqueName: \"kubernetes.io/projected/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-kube-api-access-b2th5\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.308862 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.308997 4764 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.309158 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp8tx\" (UniqueName: \"kubernetes.io/projected/b83b1169-c297-455f-a1a1-56c9088385c7-kube-api-access-lp8tx\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.309312 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.309568 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.309704 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b83b1169-c297-455f-a1a1-56c9088385c7-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.309843 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b1169-c297-455f-a1a1-56c9088385c7-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.310002 4764 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14bf0f11-9be0-4cd6-9395-a9c2d4e12706-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.698690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fqbvt" event={"ID":"14bf0f11-9be0-4cd6-9395-a9c2d4e12706","Type":"ContainerDied","Data":"4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b"} Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.698723 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-fqbvt" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.698732 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf3a6dcab4c2969b6afd660b2f690e1d91d3933ebd7c0287cda93374094e78b" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.701545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-6m4t2" event={"ID":"b83b1169-c297-455f-a1a1-56c9088385c7","Type":"ContainerDied","Data":"e696f12abb69d5f7a658b8298f103d8b1dd6db8cc18db762422bfcf78de64f2b"} Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.701630 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e696f12abb69d5f7a658b8298f103d8b1dd6db8cc18db762422bfcf78de64f2b" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.701595 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-6m4t2" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.796400 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ncp4w-config-6m4t2"] Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.805358 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ncp4w-config-6m4t2"] Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.881849 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ncp4w-config-nr2g9"] Mar 20 15:10:49 crc kubenswrapper[4764]: E0320 15:10:49.882171 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83b1169-c297-455f-a1a1-56c9088385c7" containerName="ovn-config" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.882186 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83b1169-c297-455f-a1a1-56c9088385c7" containerName="ovn-config" Mar 20 15:10:49 crc kubenswrapper[4764]: E0320 15:10:49.882207 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="14bf0f11-9be0-4cd6-9395-a9c2d4e12706" containerName="swift-ring-rebalance" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.882214 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bf0f11-9be0-4cd6-9395-a9c2d4e12706" containerName="swift-ring-rebalance" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.882378 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bf0f11-9be0-4cd6-9395-a9c2d4e12706" containerName="swift-ring-rebalance" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.882548 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83b1169-c297-455f-a1a1-56c9088385c7" containerName="ovn-config" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.883090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.887412 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 15:10:49 crc kubenswrapper[4764]: I0320 15:10:49.897203 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w-config-nr2g9"] Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.029783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.029846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: 
\"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.029876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.029916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.030029 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v58l\" (UniqueName: \"kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.030104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v58l\" (UniqueName: 
\"kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.131929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.132444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.133609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.163157 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v58l\" (UniqueName: \"kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l\") pod \"ovn-controller-ncp4w-config-nr2g9\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.205742 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.216086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ncp4w" Mar 20 15:10:50 crc kubenswrapper[4764]: I0320 15:10:50.750066 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ncp4w-config-nr2g9"] Mar 20 15:10:51 crc kubenswrapper[4764]: I0320 15:10:51.143969 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83b1169-c297-455f-a1a1-56c9088385c7" path="/var/lib/kubelet/pods/b83b1169-c297-455f-a1a1-56c9088385c7/volumes" Mar 20 15:10:51 crc kubenswrapper[4764]: I0320 15:10:51.479574 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:10:51 crc kubenswrapper[4764]: I0320 15:10:51.714850 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.344893 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v78qk"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.346873 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.368095 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v78qk"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.450076 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-88d7-account-create-update-p5wpr"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.451006 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.456610 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.467294 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-88d7-account-create-update-p5wpr"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.503493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.503560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2jm\" (UniqueName: \"kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.604547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.604636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.604690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2jm\" (UniqueName: \"kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.604737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q5w6\" (UniqueName: \"kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.605483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.625133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2jm\" 
(UniqueName: \"kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm\") pod \"cinder-db-create-v78qk\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.660818 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2tzqs"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.661980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.663277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v78qk" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.677623 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0631-account-create-update-7kmh8"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.678916 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.688850 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.692197 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tzqs"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.706161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q5w6\" (UniqueName: \"kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.706289 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.707159 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.715012 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0631-account-create-update-7kmh8"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.739468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q5w6\" 
(UniqueName: \"kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6\") pod \"cinder-88d7-account-create-update-p5wpr\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.770360 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.770651 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7fdgf"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.772430 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.781183 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7fdgf"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.807944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.808060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqkk7\" (UniqueName: \"kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.808200 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72qz\" (UniqueName: 
\"kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.808230 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.832917 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wmzgg"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.835411 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.841412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrt4" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.841683 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.847676 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.848062 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.863414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wmzgg"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.873801 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-329f-account-create-update-qxhhn"] Mar 20 15:10:53 crc 
kubenswrapper[4764]: I0320 15:10:53.874955 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.877248 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.884446 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-329f-account-create-update-qxhhn"] Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911677 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqkk7\" (UniqueName: \"kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c72qz\" (UniqueName: \"kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.911922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbq5\" (UniqueName: \"kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.912778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.915487 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.927488 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqkk7\" 
(UniqueName: \"kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7\") pod \"barbican-0631-account-create-update-7kmh8\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.929332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72qz\" (UniqueName: \"kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz\") pod \"barbican-db-create-2tzqs\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:53 crc kubenswrapper[4764]: I0320 15:10:53.984859 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tzqs" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.001482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr6v\" (UniqueName: \"kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-snbq5\" (UniqueName: \"kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013643 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xktwp\" (UniqueName: \"kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.013676 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.014780 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.029178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbq5\" (UniqueName: \"kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5\") pod \"neutron-db-create-7fdgf\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.113693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7fdgf" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.115310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.115367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xktwp\" (UniqueName: \"kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.115439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data\") pod \"keystone-db-sync-wmzgg\" (UID: 
\"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.115482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr6v\" (UniqueName: \"kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.115510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.116287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.121626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.121703 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" 
Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.135285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xktwp\" (UniqueName: \"kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp\") pod \"neutron-329f-account-create-update-qxhhn\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.138701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr6v\" (UniqueName: \"kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v\") pod \"keystone-db-sync-wmzgg\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.165012 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:10:54 crc kubenswrapper[4764]: I0320 15:10:54.200227 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:10:57 crc kubenswrapper[4764]: I0320 15:10:57.251313 4764 scope.go:117] "RemoveContainer" containerID="7c79250be52e18ff86bd91d4d30432738d4258e40ff75bbeb4bceb81b5793d72" Mar 20 15:10:59 crc kubenswrapper[4764]: I0320 15:10:59.828198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:10:59 crc kubenswrapper[4764]: I0320 15:10:59.846282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0-etc-swift\") pod \"swift-storage-0\" (UID: \"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0\") " pod="openstack/swift-storage-0" Mar 20 15:11:00 crc kubenswrapper[4764]: I0320 15:11:00.021120 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 15:11:02 crc kubenswrapper[4764]: E0320 15:11:02.211222 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Mar 20 15:11:02 crc kubenswrapper[4764]: E0320 15:11:02.211689 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xskm2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxO
ptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-klrf6_openstack(adc582c0-f416-4991-89c7-9ddb850c0f2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:02 crc kubenswrapper[4764]: E0320 15:11:02.212932 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-klrf6" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:02.757596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v78qk"] Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:02.827631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-nr2g9" event={"ID":"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f","Type":"ContainerStarted","Data":"1c8381cc390c8c464e734160bcb307849ba72cceaf08235c770d7cfde7ce8aaf"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:02.827916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-nr2g9" event={"ID":"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f","Type":"ContainerStarted","Data":"fc3758614cca8a0268948d112525eb8a878df817639638579415b79a6cd81839"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:02.829059 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v78qk" 
event={"ID":"842127c9-b79d-4787-aa8f-8717e266f790","Type":"ContainerStarted","Data":"bf1a92e9279b1fc3ef09587e553cc987798de6fe50d349f24b2d32f870de2f20"} Mar 20 15:11:03 crc kubenswrapper[4764]: E0320 15:11:02.830326 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-klrf6" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:02.850817 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ncp4w-config-nr2g9" podStartSLOduration=13.850804273 podStartE2EDuration="13.850804273s" podCreationTimestamp="2026-03-20 15:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:02.84718892 +0000 UTC m=+1184.463378049" watchObservedRunningTime="2026-03-20 15:11:02.850804273 +0000 UTC m=+1184.466993402" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.426604 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wmzgg"] Mar 20 15:11:03 crc kubenswrapper[4764]: W0320 15:11:03.431301 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb825928a_6583_4399_85ba_559a5f3081a0.slice/crio-1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d WatchSource:0}: Error finding container 1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d: Status 404 returned error can't find the container with id 1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.483441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tzqs"] Mar 20 15:11:03 crc 
kubenswrapper[4764]: W0320 15:11:03.491726 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda90e7d64_0e27_4264_af29_d75b18ab3156.slice/crio-270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf WatchSource:0}: Error finding container 270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf: Status 404 returned error can't find the container with id 270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.496167 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7fdgf"] Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.514173 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-88d7-account-create-update-p5wpr"] Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.536723 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-329f-account-create-update-qxhhn"] Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.545225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0631-account-create-update-7kmh8"] Mar 20 15:11:03 crc kubenswrapper[4764]: W0320 15:11:03.550459 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe206186_2715_4930_abb1_5917419fd021.slice/crio-397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073 WatchSource:0}: Error finding container 397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073: Status 404 returned error can't find the container with id 397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073 Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.604721 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.836880 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" containerID="1c8381cc390c8c464e734160bcb307849ba72cceaf08235c770d7cfde7ce8aaf" exitCode=0 Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.836944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-nr2g9" event={"ID":"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f","Type":"ContainerDied","Data":"1c8381cc390c8c464e734160bcb307849ba72cceaf08235c770d7cfde7ce8aaf"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.838396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tzqs" event={"ID":"a90e7d64-0e27-4264-af29-d75b18ab3156","Type":"ContainerStarted","Data":"d35f6e46cb888f183b770e4a945c501f645c5991ed370bf8765dc8619fb449b0"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.838440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tzqs" event={"ID":"a90e7d64-0e27-4264-af29-d75b18ab3156","Type":"ContainerStarted","Data":"270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.839874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-329f-account-create-update-qxhhn" event={"ID":"fe206186-2715-4930-abb1-5917419fd021","Type":"ContainerStarted","Data":"ad2e1c786cbc1c21a38880b3bb055dda6c21dfc3c98d0181be6a48931b287ccf"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.839896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-329f-account-create-update-qxhhn" event={"ID":"fe206186-2715-4930-abb1-5917419fd021","Type":"ContainerStarted","Data":"397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.841214 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88d7-account-create-update-p5wpr" 
event={"ID":"2d44ec84-1f1d-4477-8441-25159cc06b9e","Type":"ContainerStarted","Data":"72df290a11c89fd7dd7e7193ad210c40d5174fa09d19a9e166161f55e97e46ce"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.841259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88d7-account-create-update-p5wpr" event={"ID":"2d44ec84-1f1d-4477-8441-25159cc06b9e","Type":"ContainerStarted","Data":"4f638ed761d3d2184e72c24d77fd4cf38f16d9dee5d18caf2bc4f537a3577793"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.842448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7fdgf" event={"ID":"7a827a7e-f360-4a04-9e24-0405b61b9501","Type":"ContainerStarted","Data":"ed9ab4db0648fec83e64bc230bb94ffa068fcdcaea68daef1abd99f722599c2d"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.842490 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7fdgf" event={"ID":"7a827a7e-f360-4a04-9e24-0405b61b9501","Type":"ContainerStarted","Data":"5b842c19b37dc88d4b238e0eaee49f9e863a8a14d8d62a5b60096e1e1ee3a868"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.843794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"8376b2b4027a50e3274e254fea545d46442b92dad03b9eafb9ba1cc7c04a02b9"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.845169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0631-account-create-update-7kmh8" event={"ID":"0f42236b-4110-466f-8ab4-6ebaafd5e570","Type":"ContainerStarted","Data":"107825f0c41ee53148d8ac0a49582e7792b029b37b9e4d8765d2dd015dd2650e"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.845193 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0631-account-create-update-7kmh8" 
event={"ID":"0f42236b-4110-466f-8ab4-6ebaafd5e570","Type":"ContainerStarted","Data":"e7c11d4ed320a58e1e889bed3d92a5087ad116fc0264d94f5ab61865575d7920"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.846139 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmzgg" event={"ID":"b825928a-6583-4399-85ba-559a5f3081a0","Type":"ContainerStarted","Data":"1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.847650 4764 generic.go:334] "Generic (PLEG): container finished" podID="842127c9-b79d-4787-aa8f-8717e266f790" containerID="1483c8585e5e4572c9912caabc2e2629a26afc72bb3fbedb1d51877c7290733d" exitCode=0 Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.847701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v78qk" event={"ID":"842127c9-b79d-4787-aa8f-8717e266f790","Type":"ContainerDied","Data":"1483c8585e5e4572c9912caabc2e2629a26afc72bb3fbedb1d51877c7290733d"} Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.876221 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-7fdgf" podStartSLOduration=10.876204295 podStartE2EDuration="10.876204295s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:03.873897134 +0000 UTC m=+1185.490086263" watchObservedRunningTime="2026-03-20 15:11:03.876204295 +0000 UTC m=+1185.492393424" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.892423 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-2tzqs" podStartSLOduration=10.892402757 podStartE2EDuration="10.892402757s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 15:11:03.889916869 +0000 UTC m=+1185.506106028" watchObservedRunningTime="2026-03-20 15:11:03.892402757 +0000 UTC m=+1185.508591896" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.905541 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-329f-account-create-update-qxhhn" podStartSLOduration=10.905523862999999 podStartE2EDuration="10.905523863s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:03.903340145 +0000 UTC m=+1185.519529274" watchObservedRunningTime="2026-03-20 15:11:03.905523863 +0000 UTC m=+1185.521712992" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.922744 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0631-account-create-update-7kmh8" podStartSLOduration=10.922728065 podStartE2EDuration="10.922728065s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:03.918209485 +0000 UTC m=+1185.534398614" watchObservedRunningTime="2026-03-20 15:11:03.922728065 +0000 UTC m=+1185.538917194" Mar 20 15:11:03 crc kubenswrapper[4764]: I0320 15:11:03.942233 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-88d7-account-create-update-p5wpr" podStartSLOduration=10.942211318 podStartE2EDuration="10.942211318s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:03.933686834 +0000 UTC m=+1185.549875963" watchObservedRunningTime="2026-03-20 15:11:03.942211318 +0000 UTC m=+1185.558400447" Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.857750 4764 
generic.go:334] "Generic (PLEG): container finished" podID="0f42236b-4110-466f-8ab4-6ebaafd5e570" containerID="107825f0c41ee53148d8ac0a49582e7792b029b37b9e4d8765d2dd015dd2650e" exitCode=0 Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.857816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0631-account-create-update-7kmh8" event={"ID":"0f42236b-4110-466f-8ab4-6ebaafd5e570","Type":"ContainerDied","Data":"107825f0c41ee53148d8ac0a49582e7792b029b37b9e4d8765d2dd015dd2650e"} Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.866175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-329f-account-create-update-qxhhn" event={"ID":"fe206186-2715-4930-abb1-5917419fd021","Type":"ContainerDied","Data":"ad2e1c786cbc1c21a38880b3bb055dda6c21dfc3c98d0181be6a48931b287ccf"} Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.866153 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe206186-2715-4930-abb1-5917419fd021" containerID="ad2e1c786cbc1c21a38880b3bb055dda6c21dfc3c98d0181be6a48931b287ccf" exitCode=0 Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.882591 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d44ec84-1f1d-4477-8441-25159cc06b9e" containerID="72df290a11c89fd7dd7e7193ad210c40d5174fa09d19a9e166161f55e97e46ce" exitCode=0 Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.882746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88d7-account-create-update-p5wpr" event={"ID":"2d44ec84-1f1d-4477-8441-25159cc06b9e","Type":"ContainerDied","Data":"72df290a11c89fd7dd7e7193ad210c40d5174fa09d19a9e166161f55e97e46ce"} Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.894924 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a827a7e-f360-4a04-9e24-0405b61b9501" containerID="ed9ab4db0648fec83e64bc230bb94ffa068fcdcaea68daef1abd99f722599c2d" exitCode=0 Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.895064 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7fdgf" event={"ID":"7a827a7e-f360-4a04-9e24-0405b61b9501","Type":"ContainerDied","Data":"ed9ab4db0648fec83e64bc230bb94ffa068fcdcaea68daef1abd99f722599c2d"} Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.898706 4764 generic.go:334] "Generic (PLEG): container finished" podID="a90e7d64-0e27-4264-af29-d75b18ab3156" containerID="d35f6e46cb888f183b770e4a945c501f645c5991ed370bf8765dc8619fb449b0" exitCode=0 Mar 20 15:11:04 crc kubenswrapper[4764]: I0320 15:11:04.898904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tzqs" event={"ID":"a90e7d64-0e27-4264-af29-d75b18ab3156","Type":"ContainerDied","Data":"d35f6e46cb888f183b770e4a945c501f645c5991ed370bf8765dc8619fb449b0"} Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.261933 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.426686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v58l\" (UniqueName: \"kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l\") pod \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.426830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run\") pod \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.426950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts\") pod 
\"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.426989 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts\") pod \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.427026 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn\") pod \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.427045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn\") pod \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\" (UID: \"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f\") " Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.427547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.427590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run" (OuterVolumeSpecName: "var-run") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.427584 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.428361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts" (OuterVolumeSpecName: "scripts") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.428537 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.432433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l" (OuterVolumeSpecName: "kube-api-access-2v58l") pod "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" (UID: "5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f"). InnerVolumeSpecName "kube-api-access-2v58l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531103 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v58l\" (UniqueName: \"kubernetes.io/projected/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-kube-api-access-2v58l\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531143 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531159 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531171 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531183 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.531193 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.919877 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ncp4w-config-nr2g9"] Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.922076 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ncp4w-config-nr2g9" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.922075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ncp4w-config-nr2g9" event={"ID":"5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f","Type":"ContainerDied","Data":"fc3758614cca8a0268948d112525eb8a878df817639638579415b79a6cd81839"} Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.922257 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc3758614cca8a0268948d112525eb8a878df817639638579415b79a6cd81839" Mar 20 15:11:05 crc kubenswrapper[4764]: I0320 15:11:05.930450 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ncp4w-config-nr2g9"] Mar 20 15:11:06 crc kubenswrapper[4764]: E0320 15:11:06.137050 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e45cbee_7c7e_4bd7_89f4_f49d6c044a6f.slice/crio-fc3758614cca8a0268948d112525eb8a878df817639638579415b79a6cd81839\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e45cbee_7c7e_4bd7_89f4_f49d6c044a6f.slice\": RecentStats: unable to find data in memory cache]" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.136829 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" path="/var/lib/kubelet/pods/5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f/volumes" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.948329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0631-account-create-update-7kmh8" event={"ID":"0f42236b-4110-466f-8ab4-6ebaafd5e570","Type":"ContainerDied","Data":"e7c11d4ed320a58e1e889bed3d92a5087ad116fc0264d94f5ab61865575d7920"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.948798 4764 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c11d4ed320a58e1e889bed3d92a5087ad116fc0264d94f5ab61865575d7920" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.952101 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.953957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88d7-account-create-update-p5wpr" event={"ID":"2d44ec84-1f1d-4477-8441-25159cc06b9e","Type":"ContainerDied","Data":"4f638ed761d3d2184e72c24d77fd4cf38f16d9dee5d18caf2bc4f537a3577793"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.953995 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f638ed761d3d2184e72c24d77fd4cf38f16d9dee5d18caf2bc4f537a3577793" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.957348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v78qk" event={"ID":"842127c9-b79d-4787-aa8f-8717e266f790","Type":"ContainerDied","Data":"bf1a92e9279b1fc3ef09587e553cc987798de6fe50d349f24b2d32f870de2f20"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.957372 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf1a92e9279b1fc3ef09587e553cc987798de6fe50d349f24b2d32f870de2f20" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.959988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7fdgf" event={"ID":"7a827a7e-f360-4a04-9e24-0405b61b9501","Type":"ContainerDied","Data":"5b842c19b37dc88d4b238e0eaee49f9e863a8a14d8d62a5b60096e1e1ee3a868"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.960010 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b842c19b37dc88d4b238e0eaee49f9e863a8a14d8d62a5b60096e1e1ee3a868" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 
15:11:07.962070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tzqs" event={"ID":"a90e7d64-0e27-4264-af29-d75b18ab3156","Type":"ContainerDied","Data":"270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.962091 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270792c0d33597bce90c3004edff378d7db45ddc453470fda2d425d52a0d01bf" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.963803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-329f-account-create-update-qxhhn" event={"ID":"fe206186-2715-4930-abb1-5917419fd021","Type":"ContainerDied","Data":"397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073"} Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.963840 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="397336bb7b2a138d225f1f96b37aa4c1e7770cd251b30df194ae410de856a073" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.963895 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-329f-account-create-update-qxhhn" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.977912 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tzqs" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.983698 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v78qk" Mar 20 15:11:07 crc kubenswrapper[4764]: I0320 15:11:07.995667 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.046364 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.053084 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7fdgf" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.076369 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts\") pod \"fe206186-2715-4930-abb1-5917419fd021\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.076455 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xktwp\" (UniqueName: \"kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp\") pod \"fe206186-2715-4930-abb1-5917419fd021\" (UID: \"fe206186-2715-4930-abb1-5917419fd021\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.081054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe206186-2715-4930-abb1-5917419fd021" (UID: "fe206186-2715-4930-abb1-5917419fd021"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.085120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp" (OuterVolumeSpecName: "kube-api-access-xktwp") pod "fe206186-2715-4930-abb1-5917419fd021" (UID: "fe206186-2715-4930-abb1-5917419fd021"). InnerVolumeSpecName "kube-api-access-xktwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q5w6\" (UniqueName: \"kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6\") pod \"2d44ec84-1f1d-4477-8441-25159cc06b9e\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts\") pod \"a90e7d64-0e27-4264-af29-d75b18ab3156\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts\") pod \"842127c9-b79d-4787-aa8f-8717e266f790\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts\") pod \"0f42236b-4110-466f-8ab4-6ebaafd5e570\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l2jm\" (UniqueName: \"kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm\") pod \"842127c9-b79d-4787-aa8f-8717e266f790\" (UID: \"842127c9-b79d-4787-aa8f-8717e266f790\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-snbq5\" (UniqueName: \"kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5\") pod \"7a827a7e-f360-4a04-9e24-0405b61b9501\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqkk7\" (UniqueName: \"kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7\") pod \"0f42236b-4110-466f-8ab4-6ebaafd5e570\" (UID: \"0f42236b-4110-466f-8ab4-6ebaafd5e570\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts\") pod \"2d44ec84-1f1d-4477-8441-25159cc06b9e\" (UID: \"2d44ec84-1f1d-4477-8441-25159cc06b9e\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts\") pod \"7a827a7e-f360-4a04-9e24-0405b61b9501\" (UID: \"7a827a7e-f360-4a04-9e24-0405b61b9501\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.177730 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c72qz\" (UniqueName: \"kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz\") pod \"a90e7d64-0e27-4264-af29-d75b18ab3156\" (UID: \"a90e7d64-0e27-4264-af29-d75b18ab3156\") " Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.178124 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe206186-2715-4930-abb1-5917419fd021-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc 
kubenswrapper[4764]: I0320 15:11:08.178140 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xktwp\" (UniqueName: \"kubernetes.io/projected/fe206186-2715-4930-abb1-5917419fd021-kube-api-access-xktwp\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.180877 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d44ec84-1f1d-4477-8441-25159cc06b9e" (UID: "2d44ec84-1f1d-4477-8441-25159cc06b9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.180943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a90e7d64-0e27-4264-af29-d75b18ab3156" (UID: "a90e7d64-0e27-4264-af29-d75b18ab3156"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.180991 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "842127c9-b79d-4787-aa8f-8717e266f790" (UID: "842127c9-b79d-4787-aa8f-8717e266f790"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.181021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a827a7e-f360-4a04-9e24-0405b61b9501" (UID: "7a827a7e-f360-4a04-9e24-0405b61b9501"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.181046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm" (OuterVolumeSpecName: "kube-api-access-4l2jm") pod "842127c9-b79d-4787-aa8f-8717e266f790" (UID: "842127c9-b79d-4787-aa8f-8717e266f790"). InnerVolumeSpecName "kube-api-access-4l2jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.181500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f42236b-4110-466f-8ab4-6ebaafd5e570" (UID: "0f42236b-4110-466f-8ab4-6ebaafd5e570"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.182295 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz" (OuterVolumeSpecName: "kube-api-access-c72qz") pod "a90e7d64-0e27-4264-af29-d75b18ab3156" (UID: "a90e7d64-0e27-4264-af29-d75b18ab3156"). InnerVolumeSpecName "kube-api-access-c72qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.182569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5" (OuterVolumeSpecName: "kube-api-access-snbq5") pod "7a827a7e-f360-4a04-9e24-0405b61b9501" (UID: "7a827a7e-f360-4a04-9e24-0405b61b9501"). InnerVolumeSpecName "kube-api-access-snbq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.183680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6" (OuterVolumeSpecName: "kube-api-access-2q5w6") pod "2d44ec84-1f1d-4477-8441-25159cc06b9e" (UID: "2d44ec84-1f1d-4477-8441-25159cc06b9e"). InnerVolumeSpecName "kube-api-access-2q5w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.185483 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7" (OuterVolumeSpecName: "kube-api-access-qqkk7") pod "0f42236b-4110-466f-8ab4-6ebaafd5e570" (UID: "0f42236b-4110-466f-8ab4-6ebaafd5e570"). InnerVolumeSpecName "kube-api-access-qqkk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280033 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842127c9-b79d-4787-aa8f-8717e266f790-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280369 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f42236b-4110-466f-8ab4-6ebaafd5e570-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280410 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l2jm\" (UniqueName: \"kubernetes.io/projected/842127c9-b79d-4787-aa8f-8717e266f790-kube-api-access-4l2jm\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280430 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbq5\" (UniqueName: 
\"kubernetes.io/projected/7a827a7e-f360-4a04-9e24-0405b61b9501-kube-api-access-snbq5\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280448 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqkk7\" (UniqueName: \"kubernetes.io/projected/0f42236b-4110-466f-8ab4-6ebaafd5e570-kube-api-access-qqkk7\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280467 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d44ec84-1f1d-4477-8441-25159cc06b9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280485 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a827a7e-f360-4a04-9e24-0405b61b9501-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280505 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c72qz\" (UniqueName: \"kubernetes.io/projected/a90e7d64-0e27-4264-af29-d75b18ab3156-kube-api-access-c72qz\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280523 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q5w6\" (UniqueName: \"kubernetes.io/projected/2d44ec84-1f1d-4477-8441-25159cc06b9e-kube-api-access-2q5w6\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.280541 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a90e7d64-0e27-4264-af29-d75b18ab3156-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.443969 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.444029 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.444078 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.444854 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.444925 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11" gracePeriod=600 Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.980785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmzgg" event={"ID":"b825928a-6583-4399-85ba-559a5f3081a0","Type":"ContainerStarted","Data":"77ddb731624b68f78077df83ad7fc5d98c9260a4f1def31d45218fa7894f5400"} Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.986696 4764 generic.go:334] "Generic (PLEG): container 
finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11" exitCode=0 Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.986752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11"} Mar 20 15:11:08 crc kubenswrapper[4764]: I0320 15:11:08.986906 4764 scope.go:117] "RemoveContainer" containerID="99ce91acea5a3e1ed101da87e85dacfd4e4d5333d6ac9096a602d551d9d17b34" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.000165 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-88d7-account-create-update-p5wpr" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.000258 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tzqs" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.003987 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0631-account-create-update-7kmh8" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.004447 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-v78qk" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.004452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"e82721f90a9038c1f6895208a5837de9d5880ca4927bdcd1b7fcc57b850ddd18"} Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.004530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"728cdfb35ccff2a36ce7aa50c797e6cf616090eec9f8d3e95205d710a84c6b89"} Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.004544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"bd71fa8364fea299cb632587bb86174ac0ad32320d1ce51552fcf314bb0a65c4"} Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.004630 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7fdgf" Mar 20 15:11:09 crc kubenswrapper[4764]: I0320 15:11:09.008072 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wmzgg" podStartSLOduration=11.60555406 podStartE2EDuration="16.008050261s" podCreationTimestamp="2026-03-20 15:10:53 +0000 UTC" firstStartedPulling="2026-03-20 15:11:03.435748319 +0000 UTC m=+1185.051937458" lastFinishedPulling="2026-03-20 15:11:07.83824452 +0000 UTC m=+1189.454433659" observedRunningTime="2026-03-20 15:11:09.005040498 +0000 UTC m=+1190.621229667" watchObservedRunningTime="2026-03-20 15:11:09.008050261 +0000 UTC m=+1190.624239390" Mar 20 15:11:11 crc kubenswrapper[4764]: I0320 15:11:11.024281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2"} Mar 20 15:11:11 crc kubenswrapper[4764]: I0320 15:11:11.027211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"21a3f0b59f614d5cbc62ecd5f99921da1cc5cc7d77df686faf0162cff08029d2"} Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.068477 4764 generic.go:334] "Generic (PLEG): container finished" podID="b825928a-6583-4399-85ba-559a5f3081a0" containerID="77ddb731624b68f78077df83ad7fc5d98c9260a4f1def31d45218fa7894f5400" exitCode=0 Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.068509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmzgg" event={"ID":"b825928a-6583-4399-85ba-559a5f3081a0","Type":"ContainerDied","Data":"77ddb731624b68f78077df83ad7fc5d98c9260a4f1def31d45218fa7894f5400"} Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.073852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"f6bec5d02e1c80a0c22c4b1d869026c615ee64ad7aa42e563674cf55cf32584c"} Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.073935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"786751df865be4a6804801dd489f50ebd44e64305c23139e6e9f7c223ad85e8e"} Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.073984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"eb16af4c4d622ec0279afb15c05bad9bb817ddfef0ee82e309f3fd4a069185bd"} Mar 20 15:11:13 crc kubenswrapper[4764]: I0320 15:11:13.074010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"aa630bfca9eb0390564751597998714bb966513ba2d93e6cee4795981a723dde"} Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.423612 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.499146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcr6v\" (UniqueName: \"kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v\") pod \"b825928a-6583-4399-85ba-559a5f3081a0\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.499225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data\") pod \"b825928a-6583-4399-85ba-559a5f3081a0\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.499363 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle\") pod \"b825928a-6583-4399-85ba-559a5f3081a0\" (UID: \"b825928a-6583-4399-85ba-559a5f3081a0\") " Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.502827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v" (OuterVolumeSpecName: "kube-api-access-fcr6v") pod "b825928a-6583-4399-85ba-559a5f3081a0" (UID: "b825928a-6583-4399-85ba-559a5f3081a0"). InnerVolumeSpecName "kube-api-access-fcr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.523863 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b825928a-6583-4399-85ba-559a5f3081a0" (UID: "b825928a-6583-4399-85ba-559a5f3081a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.546599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data" (OuterVolumeSpecName: "config-data") pod "b825928a-6583-4399-85ba-559a5f3081a0" (UID: "b825928a-6583-4399-85ba-559a5f3081a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.601782 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcr6v\" (UniqueName: \"kubernetes.io/projected/b825928a-6583-4399-85ba-559a5f3081a0-kube-api-access-fcr6v\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.601837 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:14 crc kubenswrapper[4764]: I0320 15:11:14.601858 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b825928a-6583-4399-85ba-559a5f3081a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.100432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"22eb3b8a11c05e60b123cb8a457fa736a45f56613d9f166766b53bf542e6d032"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.100917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"615011a100bc1beaff247a129c14ff8b934e5c9219e3590b8e12dc47b6a48b86"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.100943 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"3b9743946ef8a1dbc3cab9ee1e9b3d1bac73f0b79b6e558bbdb21fc36f74f4c7"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.100963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"b53549f5c2399a7a4ff305c3593d21a13aa1004dcde1074f33f9d4f49e3ac14d"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.100985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"177b6a974ba9fb45cbd29a0f187442fcff15f9331a1366ba45c8dcf4604163a8"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.104283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wmzgg" event={"ID":"b825928a-6583-4399-85ba-559a5f3081a0","Type":"ContainerDied","Data":"1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d"} Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.104326 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d459cf6c6323fa73e7b164809b2b7f53c5564f45070598594b8fd8736f1da1d" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.104326 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-wmzgg" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290464 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"] Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290807 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe206186-2715-4930-abb1-5917419fd021" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe206186-2715-4930-abb1-5917419fd021" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290843 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d44ec84-1f1d-4477-8441-25159cc06b9e" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290849 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d44ec84-1f1d-4477-8441-25159cc06b9e" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290860 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b825928a-6583-4399-85ba-559a5f3081a0" containerName="keystone-db-sync" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b825928a-6583-4399-85ba-559a5f3081a0" containerName="keystone-db-sync" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290881 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f42236b-4110-466f-8ab4-6ebaafd5e570" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290887 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f42236b-4110-466f-8ab4-6ebaafd5e570" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290898 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" containerName="ovn-config" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290904 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" containerName="ovn-config" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290919 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90e7d64-0e27-4264-af29-d75b18ab3156" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290925 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90e7d64-0e27-4264-af29-d75b18ab3156" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290939 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a827a7e-f360-4a04-9e24-0405b61b9501" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290947 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a827a7e-f360-4a04-9e24-0405b61b9501" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: E0320 15:11:15.290964 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842127c9-b79d-4787-aa8f-8717e266f790" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.290971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="842127c9-b79d-4787-aa8f-8717e266f790" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291144 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="842127c9-b79d-4787-aa8f-8717e266f790" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291156 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90e7d64-0e27-4264-af29-d75b18ab3156" containerName="mariadb-database-create" Mar 20 15:11:15 crc 
kubenswrapper[4764]: I0320 15:11:15.291167 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b825928a-6583-4399-85ba-559a5f3081a0" containerName="keystone-db-sync" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291176 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe206186-2715-4930-abb1-5917419fd021" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291187 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d44ec84-1f1d-4477-8441-25159cc06b9e" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291198 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e45cbee-7c7e-4bd7-89f4-f49d6c044a6f" containerName="ovn-config" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291209 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f42236b-4110-466f-8ab4-6ebaafd5e570" containerName="mariadb-account-create-update" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.291216 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a827a7e-f360-4a04-9e24-0405b61b9501" containerName="mariadb-database-create" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.292181 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.299886 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.311884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjvs\" (UniqueName: \"kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.311936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.311965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.311981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.312030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.333287 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-g6d25"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.334214 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.339067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.339329 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrt4" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.339493 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.339606 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.342263 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.356253 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g6d25"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.412935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.412985 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjvs\" (UniqueName: \"kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tk9\" (UniqueName: \"kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413216 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.413981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.415026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.415068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.415329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.439508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjvs\" (UniqueName: \"kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs\") pod \"dnsmasq-dns-f877ddd87-sdgmf\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.450821 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.451991 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.455448 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.458702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-hw7q4" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.458870 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.458997 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.483922 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tk9\" (UniqueName: \"kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data\") pod 
\"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514565 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpkh2\" (UniqueName: \"kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514717 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " 
pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.514810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.520318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.520913 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.522546 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.526069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.527427 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6zj6m"] Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.528193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.528416 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.532770 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.533082 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.533268 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hq7b8"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.552949 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p5v7z"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.554274 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.556035 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.556230 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.558249 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cn5fs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.564274 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tk9\" (UniqueName: \"kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9\") pod \"keystone-bootstrap-g6d25\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " pod="openstack/keystone-bootstrap-g6d25"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.565328 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6zj6m"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.594799 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5v7z"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615148 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpkh2\" (UniqueName: \"kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rpc\" (UniqueName: \"kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmd9s\" (UniqueName: \"kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.615691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.616199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.616674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.617533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.622561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.622613 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-q4nl4"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.623711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.625835 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.625973 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l8z6t"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.644518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpkh2\" (UniqueName: \"kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2\") pod \"horizon-95f45b77f-k4bfr\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.649010 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q4nl4"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.651711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g6d25"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.659912 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.662558 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.721320 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.733978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z28rx\" (UniqueName: \"kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734221 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rpc\" (UniqueName: \"kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734295 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmd9s\" (UniqueName: \"kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.734609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64dx5\" (UniqueName: \"kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.737922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.743594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.743676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.750007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.756428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.758800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmd9s\" (UniqueName: \"kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.760739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config\") pod \"neutron-db-sync-p5v7z\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.760826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rpc\" (UniqueName: \"kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.761227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle\") pod \"cinder-db-sync-6zj6m\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.797538 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.817923 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.821530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839272 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j286\" (UniqueName: \"kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64dx5\" (UniqueName: \"kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839394 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z28rx\" (UniqueName: \"kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.839533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.840018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.842327 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.843242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.843837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95f45b77f-k4bfr"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.862111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.862607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.864963 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.873926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.891810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64dx5\" (UniqueName: \"kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5\") pod \"barbican-db-sync-q4nl4\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.902348 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6zj6m"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.902903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z28rx\" (UniqueName: \"kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx\") pod \"horizon-5557c86df7-xp2rs\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.906436 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xhldv"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.915949 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5v7z"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.917658 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.920779 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.920948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.921059 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-clqvv"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.942486 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-q4nl4"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.942998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.943041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.943061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.943100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.943127 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j286\" (UniqueName: \"kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.944124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.945042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.949805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.958226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xhldv"]
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.965686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:15 crc kubenswrapper[4764]: I0320 15:11:15.998219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j286\" (UniqueName: \"kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286\") pod \"dnsmasq-dns-68dcc9cf6f-64fhj\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.005437 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.007335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.008506 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5557c86df7-xp2rs"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.010771 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.010954 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.042882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.044162 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.044199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.044229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.044245 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlgz\" (UniqueName: \"kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.044263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv"
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.095511 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"]
Mar 20 15:11:16 crc kubenswrapper[4764]: W0320 15:11:16.122630 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26db7af0_c1a6_4af8_9264_4129bb84ab98.slice/crio-b553077ff2654e2dfae3082df722abc06b448567fc1fb82c56a2dece99d6f286 WatchSource:0}: Error finding container b553077ff2654e2dfae3082df722abc06b448567fc1fb82c56a2dece99d6f286: Status 404 returned error can't find the container with id b553077ff2654e2dfae3082df722abc06b448567fc1fb82c56a2dece99d6f286
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.134810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"5a94a2834312c3b8acde8b3718a9ce745bae8955a212ba42e8416633077c1f1a"}
Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.134845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0","Type":"ContainerStarted","Data":"b498815b642e160753627a80834630309e0c3bf0d7e2a68adda8cc6a6d315c85"} Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146437 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tghv4\" (UniqueName: \"kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146464 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146642 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146662 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlgz\" (UniqueName: \"kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data\") pod \"placement-db-sync-xhldv\" (UID: 
\"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.146738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.147133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.150448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.153705 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.154316 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.170024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlgz\" (UniqueName: \"kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz\") pod \"placement-db-sync-xhldv\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.192550 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.676698548 podStartE2EDuration="50.192532471s" podCreationTimestamp="2026-03-20 15:10:26 +0000 UTC" firstStartedPulling="2026-03-20 15:11:03.614621103 +0000 UTC m=+1185.230810232" lastFinishedPulling="2026-03-20 15:11:14.130455006 +0000 UTC m=+1195.746644155" observedRunningTime="2026-03-20 15:11:16.186323678 +0000 UTC m=+1197.802512807" watchObservedRunningTime="2026-03-20 15:11:16.192532471 +0000 UTC m=+1197.808721600" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.216334 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tghv4\" (UniqueName: \"kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248674 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.248854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.250953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.257303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.257565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.257925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.257926 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.280691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tghv4\" (UniqueName: \"kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.292418 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.299108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.342223 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.357905 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-g6d25"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.563992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.571562 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.604333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.605705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.613822 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.628167 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.758839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.759304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvptj\" (UniqueName: \"kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.759418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.759525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.759633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.759737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.819028 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-q4nl4"] Mar 20 15:11:16 crc kubenswrapper[4764]: W0320 15:11:16.827768 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329bd08e_9bf1_4c6e_b234_e99022daa848.slice/crio-e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a WatchSource:0}: Error finding container e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a: Status 404 returned error can't find the container with id e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.831475 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"] Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvptj\" (UniqueName: \"kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868551 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.868659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.869609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.869630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.878361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config\") pod 
\"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.878539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.881075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.893091 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvptj\" (UniqueName: \"kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj\") pod \"dnsmasq-dns-58dd9ff6bc-n6p2n\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:16 crc kubenswrapper[4764]: W0320 15:11:16.960353 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod337e2278_00e7_428e_97c1_c8d940d83aa4.slice/crio-841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43 WatchSource:0}: Error finding container 841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43: Status 404 returned error can't find the container with id 841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43 Mar 20 15:11:16 crc kubenswrapper[4764]: I0320 15:11:16.963798 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6zj6m"] Mar 20 15:11:17 crc 
kubenswrapper[4764]: I0320 15:11:17.076232 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.125683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5v7z"] Mar 20 15:11:17 crc kubenswrapper[4764]: W0320 15:11:17.143996 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c1ae0e_05e5_4d8a_836c_18e38b0a8dbf.slice/crio-12430fc817b533b35482a0846599493f67b71c24a6c19169c80984556d4b90f7 WatchSource:0}: Error finding container 12430fc817b533b35482a0846599493f67b71c24a6c19169c80984556d4b90f7: Status 404 returned error can't find the container with id 12430fc817b533b35482a0846599493f67b71c24a6c19169c80984556d4b90f7 Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.175054 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.175099 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xhldv"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.221288 4764 generic.go:334] "Generic (PLEG): container finished" podID="26db7af0-c1a6-4af8-9264-4129bb84ab98" containerID="76a3685c5df09e0e0366b72ef0319493e18a6397b3d3d4328e357f926b6f316b" exitCode=0 Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.221353 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" event={"ID":"26db7af0-c1a6-4af8-9264-4129bb84ab98","Type":"ContainerDied","Data":"76a3685c5df09e0e0366b72ef0319493e18a6397b3d3d4328e357f926b6f316b"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.221394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" 
event={"ID":"26db7af0-c1a6-4af8-9264-4129bb84ab98","Type":"ContainerStarted","Data":"b553077ff2654e2dfae3082df722abc06b448567fc1fb82c56a2dece99d6f286"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.224519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5557c86df7-xp2rs" event={"ID":"72ea7eea-19f9-4957-8cf2-7407a45137f2","Type":"ContainerStarted","Data":"04ef0cc5b3b231c923ecac2c2da4ea37cf4a7644d1924a3c6c08d4347ed41ee5"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.225878 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f45b77f-k4bfr" event={"ID":"9019e60a-b910-45e2-8c8c-62a0c8982cc4","Type":"ContainerStarted","Data":"2c141e8a1c62865531b03081d7709e2c3b4d576ee4b1704e44330d9add356de7"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.240118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6zj6m" event={"ID":"337e2278-00e7-428e-97c1-c8d940d83aa4","Type":"ContainerStarted","Data":"841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.248768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g6d25" event={"ID":"2a4d6b5c-a622-4180-9255-f8376001de5c","Type":"ContainerStarted","Data":"8dbbb55a76c2179a8117d688f9055882c91cbebaa24df85b6b20ed45136493e3"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.249008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g6d25" event={"ID":"2a4d6b5c-a622-4180-9255-f8376001de5c","Type":"ContainerStarted","Data":"fa6afeebacd38a3222921f8ce390fbb9918ab72210305c907bae66d0595673ce"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.264102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4nl4" 
event={"ID":"329bd08e-9bf1-4c6e-b234-e99022daa848","Type":"ContainerStarted","Data":"e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a"} Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.280946 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-g6d25" podStartSLOduration=2.280930733 podStartE2EDuration="2.280930733s" podCreationTimestamp="2026-03-20 15:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:17.277715983 +0000 UTC m=+1198.893905112" watchObservedRunningTime="2026-03-20 15:11:17.280930733 +0000 UTC m=+1198.897119862" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.297493 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.641511 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.673035 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.796242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb\") pod \"26db7af0-c1a6-4af8-9264-4129bb84ab98\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.796306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb\") pod \"26db7af0-c1a6-4af8-9264-4129bb84ab98\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.796324 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc\") pod \"26db7af0-c1a6-4af8-9264-4129bb84ab98\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.796408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjvs\" (UniqueName: \"kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs\") pod \"26db7af0-c1a6-4af8-9264-4129bb84ab98\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.796482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config\") pod \"26db7af0-c1a6-4af8-9264-4129bb84ab98\" (UID: \"26db7af0-c1a6-4af8-9264-4129bb84ab98\") " Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.804261 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:11:17 crc 
kubenswrapper[4764]: I0320 15:11:17.809985 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.814802 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs" (OuterVolumeSpecName: "kube-api-access-4xjvs") pod "26db7af0-c1a6-4af8-9264-4129bb84ab98" (UID: "26db7af0-c1a6-4af8-9264-4129bb84ab98"). InnerVolumeSpecName "kube-api-access-4xjvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.830347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26db7af0-c1a6-4af8-9264-4129bb84ab98" (UID: "26db7af0-c1a6-4af8-9264-4129bb84ab98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.842150 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26db7af0-c1a6-4af8-9264-4129bb84ab98" (UID: "26db7af0-c1a6-4af8-9264-4129bb84ab98"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.846946 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:17 crc kubenswrapper[4764]: E0320 15:11:17.847640 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26db7af0-c1a6-4af8-9264-4129bb84ab98" containerName="init" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.847659 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="26db7af0-c1a6-4af8-9264-4129bb84ab98" containerName="init" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.847840 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="26db7af0-c1a6-4af8-9264-4129bb84ab98" containerName="init" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.850366 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.866732 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26db7af0-c1a6-4af8-9264-4129bb84ab98" (UID: "26db7af0-c1a6-4af8-9264-4129bb84ab98"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.872636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.882903 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config" (OuterVolumeSpecName: "config") pod "26db7af0-c1a6-4af8-9264-4129bb84ab98" (UID: "26db7af0-c1a6-4af8-9264-4129bb84ab98"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.901457 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.901491 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.901500 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.901509 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjvs\" (UniqueName: \"kubernetes.io/projected/26db7af0-c1a6-4af8-9264-4129bb84ab98-kube-api-access-4xjvs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:17 crc kubenswrapper[4764]: I0320 15:11:17.901521 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26db7af0-c1a6-4af8-9264-4129bb84ab98-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.003262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpcbp\" (UniqueName: \"kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.003669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.003703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.003730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.003770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.105141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.105400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs\") pod 
\"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.105449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.105501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.105563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpcbp\" (UniqueName: \"kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.108034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.108342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 
15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.109829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.113261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.122586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpcbp\" (UniqueName: \"kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp\") pod \"horizon-857f5dbd69-s9wj7\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.174183 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.273583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-klrf6" event={"ID":"adc582c0-f416-4991-89c7-9ddb850c0f2b","Type":"ContainerStarted","Data":"1606e8ccc1db1c2c734072f286bcbebadc743142ff9a6e45d65e106544241caf"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.299330 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-klrf6" podStartSLOduration=3.380656785 podStartE2EDuration="34.299310049s" podCreationTimestamp="2026-03-20 15:10:44 +0000 UTC" firstStartedPulling="2026-03-20 15:10:45.826988848 +0000 UTC m=+1167.443177987" lastFinishedPulling="2026-03-20 15:11:16.745642122 +0000 UTC m=+1198.361831251" observedRunningTime="2026-03-20 15:11:18.291429945 +0000 UTC m=+1199.907619074" watchObservedRunningTime="2026-03-20 15:11:18.299310049 +0000 UTC m=+1199.915499178" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.307688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerStarted","Data":"dd534bf1da85441e2afcdbebd3bee5022c399a2f3537f61e41f39888b21f7ac2"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.312235 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5v7z" event={"ID":"8d3ec055-05fb-42c0-bb97-342be3f1e32d","Type":"ContainerStarted","Data":"8eff8c0ac05e54673fbf6c7e2b62165935fca4bb2a85b981663fe393db72cd8f"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.312269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5v7z" event={"ID":"8d3ec055-05fb-42c0-bb97-342be3f1e32d","Type":"ContainerStarted","Data":"06b00029b4fc0c4b54f8c6503dd323304471901ef0b1cccbc52e475be9f9ca7c"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.324320 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" event={"ID":"26db7af0-c1a6-4af8-9264-4129bb84ab98","Type":"ContainerDied","Data":"b553077ff2654e2dfae3082df722abc06b448567fc1fb82c56a2dece99d6f286"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.324394 4764 scope.go:117] "RemoveContainer" containerID="76a3685c5df09e0e0366b72ef0319493e18a6397b3d3d4328e357f926b6f316b" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.324487 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-sdgmf" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.327041 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p5v7z" podStartSLOduration=3.327031047 podStartE2EDuration="3.327031047s" podCreationTimestamp="2026-03-20 15:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:18.323154797 +0000 UTC m=+1199.939343916" watchObservedRunningTime="2026-03-20 15:11:18.327031047 +0000 UTC m=+1199.943220196" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.333797 4764 generic.go:334] "Generic (PLEG): container finished" podID="a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" containerID="1303ad51cfc7e1080ae3ee8ea986ef23e228c3f2eac86342f77996fdc13d7c50" exitCode=0 Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.333857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" event={"ID":"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf","Type":"ContainerDied","Data":"1303ad51cfc7e1080ae3ee8ea986ef23e228c3f2eac86342f77996fdc13d7c50"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.333882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" 
event={"ID":"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf","Type":"ContainerStarted","Data":"12430fc817b533b35482a0846599493f67b71c24a6c19169c80984556d4b90f7"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.338221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xhldv" event={"ID":"df97d5e8-2808-4bef-9fad-b54c27554d23","Type":"ContainerStarted","Data":"62366ea3c949a7ec430f3ccb32643621d398bae0b8b306ee5fc07bcb7528ce26"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.339944 4764 generic.go:334] "Generic (PLEG): container finished" podID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerID="c844da4347390dad93cd84b1871b34563fdf8d02bc9da84dd749258aa59db671" exitCode=0 Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.340952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" event={"ID":"be95591c-8398-4ba2-aa65-784bc64cc1b3","Type":"ContainerDied","Data":"c844da4347390dad93cd84b1871b34563fdf8d02bc9da84dd749258aa59db671"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.340980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" event={"ID":"be95591c-8398-4ba2-aa65-784bc64cc1b3","Type":"ContainerStarted","Data":"564fb25c13dab5e6a6f8d2f44e033c39735f8d5b662320f47b7af2de9f9b402d"} Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.526994 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"] Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.542229 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-sdgmf"] Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.763707 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.817108 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.829143 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb\") pod \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.829182 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j286\" (UniqueName: \"kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286\") pod \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.829223 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc\") pod \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.829345 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb\") pod \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.829406 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config\") pod \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\" (UID: \"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf\") " Mar 20 15:11:18 
crc kubenswrapper[4764]: I0320 15:11:18.835850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286" (OuterVolumeSpecName: "kube-api-access-2j286") pod "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" (UID: "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf"). InnerVolumeSpecName "kube-api-access-2j286". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.860053 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" (UID: "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.860777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config" (OuterVolumeSpecName: "config") pod "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" (UID: "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.863705 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" (UID: "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.876724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" (UID: "a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.931685 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.931717 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.931726 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.931736 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j286\" (UniqueName: \"kubernetes.io/projected/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-kube-api-access-2j286\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:18 crc kubenswrapper[4764]: I0320 15:11:18.931748 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.161084 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26db7af0-c1a6-4af8-9264-4129bb84ab98" 
path="/var/lib/kubelet/pods/26db7af0-c1a6-4af8-9264-4129bb84ab98/volumes" Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.389172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-857f5dbd69-s9wj7" event={"ID":"96075e3c-fec4-4453-89a3-ad47336d199b","Type":"ContainerStarted","Data":"e2c88bd93fb75c283abbe79be2f7df8507206961fc68ac4c3ae3c626ee1eb3c3"} Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.397207 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" event={"ID":"a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf","Type":"ContainerDied","Data":"12430fc817b533b35482a0846599493f67b71c24a6c19169c80984556d4b90f7"} Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.397256 4764 scope.go:117] "RemoveContainer" containerID="1303ad51cfc7e1080ae3ee8ea986ef23e228c3f2eac86342f77996fdc13d7c50" Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.397340 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-64fhj" Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.402286 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" event={"ID":"be95591c-8398-4ba2-aa65-784bc64cc1b3","Type":"ContainerStarted","Data":"02e7741578177972eef3b20819a82b9e605a6acb90680547cb62400d919c277d"} Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.403310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.441339 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"] Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.449193 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-64fhj"] Mar 20 15:11:19 crc kubenswrapper[4764]: I0320 15:11:19.457534 4764 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" podStartSLOduration=3.457518671 podStartE2EDuration="3.457518671s" podCreationTimestamp="2026-03-20 15:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:19.453245839 +0000 UTC m=+1201.069434978" watchObservedRunningTime="2026-03-20 15:11:19.457518671 +0000 UTC m=+1201.073707800" Mar 20 15:11:21 crc kubenswrapper[4764]: I0320 15:11:21.139290 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" path="/var/lib/kubelet/pods/a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf/volumes" Mar 20 15:11:22 crc kubenswrapper[4764]: I0320 15:11:22.444416 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a4d6b5c-a622-4180-9255-f8376001de5c" containerID="8dbbb55a76c2179a8117d688f9055882c91cbebaa24df85b6b20ed45136493e3" exitCode=0 Mar 20 15:11:22 crc kubenswrapper[4764]: I0320 15:11:22.444442 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g6d25" event={"ID":"2a4d6b5c-a622-4180-9255-f8376001de5c","Type":"ContainerDied","Data":"8dbbb55a76c2179a8117d688f9055882c91cbebaa24df85b6b20ed45136493e3"} Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.348579 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.367821 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:11:24 crc kubenswrapper[4764]: E0320 15:11:24.368178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" containerName="init" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.368195 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" containerName="init" Mar 20 15:11:24 crc 
kubenswrapper[4764]: I0320 15:11:24.368363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c1ae0e-05e5-4d8a-836c-18e38b0a8dbf" containerName="init" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.372804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.374789 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.395469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.410056 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454178 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " 
pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454572 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.454655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfk59\" (UniqueName: \"kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.470739 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-655785589d-5cnb4"] Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.472616 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.494789 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-655785589d-5cnb4"] Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-tls-certs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556247 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdf7q\" (UniqueName: \"kubernetes.io/projected/cb148fab-0227-4725-af4e-d6dba5740303-kube-api-access-gdf7q\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-combined-ca-bundle\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556366 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-config-data\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556499 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb148fab-0227-4725-af4e-d6dba5740303-logs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs\") pod 
\"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-secret-key\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-scripts\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.556599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfk59\" (UniqueName: \"kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.557536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts\") pod \"horizon-66899c9d8-zh5gp\" (UID: 
\"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.557648 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.561292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.567102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.567193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.583208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.591157 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfk59\" (UniqueName: \"kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59\") pod \"horizon-66899c9d8-zh5gp\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb148fab-0227-4725-af4e-d6dba5740303-logs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-secret-key\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-scripts\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-tls-certs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658580 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdf7q\" (UniqueName: 
\"kubernetes.io/projected/cb148fab-0227-4725-af4e-d6dba5740303-kube-api-access-gdf7q\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-combined-ca-bundle\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.658684 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-config-data\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.659501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb148fab-0227-4725-af4e-d6dba5740303-logs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.659857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-scripts\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.660156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb148fab-0227-4725-af4e-d6dba5740303-config-data\") pod \"horizon-655785589d-5cnb4\" (UID: 
\"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.664168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-combined-ca-bundle\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.670792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-tls-certs\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.675706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cb148fab-0227-4725-af4e-d6dba5740303-horizon-secret-key\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.681102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdf7q\" (UniqueName: \"kubernetes.io/projected/cb148fab-0227-4725-af4e-d6dba5740303-kube-api-access-gdf7q\") pod \"horizon-655785589d-5cnb4\" (UID: \"cb148fab-0227-4725-af4e-d6dba5740303\") " pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.750745 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:24 crc kubenswrapper[4764]: I0320 15:11:24.785718 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:27 crc kubenswrapper[4764]: I0320 15:11:27.077544 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:11:27 crc kubenswrapper[4764]: I0320 15:11:27.154286 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:11:27 crc kubenswrapper[4764]: I0320 15:11:27.154538 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2jc7b" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" containerID="cri-o://b5f0eb3eac9afc6384858b2bbbbaf2ec68d8c0a38d04117b016885c0de25ad70" gracePeriod=10 Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.241157 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325011 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tk9\" (UniqueName: \"kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: 
\"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.325558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys\") pod \"2a4d6b5c-a622-4180-9255-f8376001de5c\" (UID: \"2a4d6b5c-a622-4180-9255-f8376001de5c\") " Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.339697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.339812 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts" (OuterVolumeSpecName: "scripts") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.339852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.348321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9" (OuterVolumeSpecName: "kube-api-access-d4tk9") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "kube-api-access-d4tk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.355522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data" (OuterVolumeSpecName: "config-data") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.367598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a4d6b5c-a622-4180-9255-f8376001de5c" (UID: "2a4d6b5c-a622-4180-9255-f8376001de5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429365 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429448 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429465 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4tk9\" (UniqueName: \"kubernetes.io/projected/2a4d6b5c-a622-4180-9255-f8376001de5c-kube-api-access-d4tk9\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429482 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429494 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.429504 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a4d6b5c-a622-4180-9255-f8376001de5c-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.504828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-g6d25" event={"ID":"2a4d6b5c-a622-4180-9255-f8376001de5c","Type":"ContainerDied","Data":"fa6afeebacd38a3222921f8ce390fbb9918ab72210305c907bae66d0595673ce"} Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 
15:11:28.504890 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6afeebacd38a3222921f8ce390fbb9918ab72210305c907bae66d0595673ce" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.504894 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-g6d25" Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.507909 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerID="b5f0eb3eac9afc6384858b2bbbbaf2ec68d8c0a38d04117b016885c0de25ad70" exitCode=0 Mar 20 15:11:28 crc kubenswrapper[4764]: I0320 15:11:28.507942 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerDied","Data":"b5f0eb3eac9afc6384858b2bbbbaf2ec68d8c0a38d04117b016885c0de25ad70"} Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.334912 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-g6d25"] Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.341814 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-g6d25"] Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.430871 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8mz56"] Mar 20 15:11:29 crc kubenswrapper[4764]: E0320 15:11:29.431207 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4d6b5c-a622-4180-9255-f8376001de5c" containerName="keystone-bootstrap" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.431230 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4d6b5c-a622-4180-9255-f8376001de5c" containerName="keystone-bootstrap" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.431449 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4d6b5c-a622-4180-9255-f8376001de5c" 
containerName="keystone-bootstrap" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.431984 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.434115 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrt4" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.434775 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.434947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.435796 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.440999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.443670 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8mz56"] Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.547776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.547836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc 
kubenswrapper[4764]: I0320 15:11:29.547877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.548012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.548151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.548194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrzw\" (UniqueName: \"kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.649631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 
15:11:29.649685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrzw\" (UniqueName: \"kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.649771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.649837 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.649889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.649923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.653955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.654057 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.654129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.654608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.654837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle\") pod \"keystone-bootstrap-8mz56\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.672959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrzw\" (UniqueName: \"kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw\") pod \"keystone-bootstrap-8mz56\" (UID: 
\"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:29 crc kubenswrapper[4764]: I0320 15:11:29.751244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:31 crc kubenswrapper[4764]: I0320 15:11:31.151534 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4d6b5c-a622-4180-9255-f8376001de5c" path="/var/lib/kubelet/pods/2a4d6b5c-a622-4180-9255-f8376001de5c/volumes" Mar 20 15:11:32 crc kubenswrapper[4764]: I0320 15:11:32.091065 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2jc7b" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: connect: connection refused" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.590066 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.590617 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n646h58bh5b5h74h5c4h598h86h57fh57bh65chf5h58fh98h5d9h7h7ch6fhd4h666h9bhd6h644h698h64bh578h5f4h8hbch559hfh99hddq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z28rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5557c86df7-xp2rs_openstack(72ea7eea-19f9-4957-8cf2-7407a45137f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.597836 
4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5557c86df7-xp2rs" podUID="72ea7eea-19f9-4957-8cf2-7407a45137f2" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.602001 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.602638 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.602727 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56dh96h8ch578h657h8ch59fh5f8hd5h5dbhd8h94h9fhd7h688h578hf8h58fh56fh95h94h58bhfchbfh5c6h55ch5c5h64bhcfh64dhd4h68dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jpkh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-95f45b77f-k4bfr_openstack(9019e60a-b910-45e2-8c8c-62a0c8982cc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 
15:11:34.602828 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h68fh6chcch5b5h588h686h5f6h5bdh5f6h5d9hdbh67fh5b6hfdh584h5b4h9chfh559h88h5d5hcbh549h5c4hcbh656h5d7h5fh687h586h85q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mpcbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-857f5dbd69-s9wj7_openstack(96075e3c-fec4-4453-89a3-ad47336d199b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.604866 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-857f5dbd69-s9wj7" podUID="96075e3c-fec4-4453-89a3-ad47336d199b" Mar 20 15:11:34 crc kubenswrapper[4764]: E0320 15:11:34.605107 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-95f45b77f-k4bfr" podUID="9019e60a-b910-45e2-8c8c-62a0c8982cc4" Mar 20 15:11:35 crc kubenswrapper[4764]: I0320 15:11:35.566205 4764 generic.go:334] "Generic (PLEG): container finished" podID="adc582c0-f416-4991-89c7-9ddb850c0f2b" containerID="1606e8ccc1db1c2c734072f286bcbebadc743142ff9a6e45d65e106544241caf" exitCode=0 Mar 20 15:11:35 crc kubenswrapper[4764]: I0320 15:11:35.566295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-klrf6" event={"ID":"adc582c0-f416-4991-89c7-9ddb850c0f2b","Type":"ContainerDied","Data":"1606e8ccc1db1c2c734072f286bcbebadc743142ff9a6e45d65e106544241caf"} Mar 20 15:11:42 crc kubenswrapper[4764]: I0320 15:11:42.091108 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2jc7b" 
podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 20 15:11:43 crc kubenswrapper[4764]: E0320 15:11:43.996190 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 15:11:43 crc kubenswrapper[4764]: E0320 15:11:43.997195 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd4h565h559h6h558h596h694h644h564h64ch689h67dh5dbh88h575h579h67bh5dfh86h64bh676h6chcbhddhd6hb8h687h68dh556h5dch59ch5dcq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tghv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},
LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.025753 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.131468 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.144066 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.149414 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.206946 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5557c86df7-xp2rs" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.213965 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-klrf6" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243239 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts\") pod \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config\") pod \"dd895e6c-1154-42f9-8490-b3832a9f815e\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243336 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc\") pod \"dd895e6c-1154-42f9-8490-b3832a9f815e\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243369 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb\") pod \"dd895e6c-1154-42f9-8490-b3832a9f815e\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243437 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpcbp\" (UniqueName: \"kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp\") pod \"96075e3c-fec4-4453-89a3-ad47336d199b\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " Mar 20 
15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs\") pod \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs\") pod \"96075e3c-fec4-4453-89a3-ad47336d199b\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key\") pod \"96075e3c-fec4-4453-89a3-ad47336d199b\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243539 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts\") pod \"96075e3c-fec4-4453-89a3-ad47336d199b\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243555 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpkh2\" (UniqueName: \"kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2\") pod \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb\") pod 
\"dd895e6c-1154-42f9-8490-b3832a9f815e\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key\") pod \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243648 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data\") pod \"96075e3c-fec4-4453-89a3-ad47336d199b\" (UID: \"96075e3c-fec4-4453-89a3-ad47336d199b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data\") pod \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\" (UID: \"9019e60a-b910-45e2-8c8c-62a0c8982cc4\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.243705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjlzj\" (UniqueName: \"kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj\") pod \"dd895e6c-1154-42f9-8490-b3832a9f815e\" (UID: \"dd895e6c-1154-42f9-8490-b3832a9f815e\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.245489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data" (OuterVolumeSpecName: "config-data") pod "96075e3c-fec4-4453-89a3-ad47336d199b" (UID: "96075e3c-fec4-4453-89a3-ad47336d199b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.247966 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts" (OuterVolumeSpecName: "scripts") pod "96075e3c-fec4-4453-89a3-ad47336d199b" (UID: "96075e3c-fec4-4453-89a3-ad47336d199b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.248134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2" (OuterVolumeSpecName: "kube-api-access-jpkh2") pod "9019e60a-b910-45e2-8c8c-62a0c8982cc4" (UID: "9019e60a-b910-45e2-8c8c-62a0c8982cc4"). InnerVolumeSpecName "kube-api-access-jpkh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.249773 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs" (OuterVolumeSpecName: "logs") pod "9019e60a-b910-45e2-8c8c-62a0c8982cc4" (UID: "9019e60a-b910-45e2-8c8c-62a0c8982cc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.251166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs" (OuterVolumeSpecName: "logs") pod "96075e3c-fec4-4453-89a3-ad47336d199b" (UID: "96075e3c-fec4-4453-89a3-ad47336d199b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.252349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "96075e3c-fec4-4453-89a3-ad47336d199b" (UID: "96075e3c-fec4-4453-89a3-ad47336d199b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.252531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts" (OuterVolumeSpecName: "scripts") pod "9019e60a-b910-45e2-8c8c-62a0c8982cc4" (UID: "9019e60a-b910-45e2-8c8c-62a0c8982cc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.254313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp" (OuterVolumeSpecName: "kube-api-access-mpcbp") pod "96075e3c-fec4-4453-89a3-ad47336d199b" (UID: "96075e3c-fec4-4453-89a3-ad47336d199b"). InnerVolumeSpecName "kube-api-access-mpcbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.271717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj" (OuterVolumeSpecName: "kube-api-access-rjlzj") pod "dd895e6c-1154-42f9-8490-b3832a9f815e" (UID: "dd895e6c-1154-42f9-8490-b3832a9f815e"). InnerVolumeSpecName "kube-api-access-rjlzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.273562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data" (OuterVolumeSpecName: "config-data") pod "9019e60a-b910-45e2-8c8c-62a0c8982cc4" (UID: "9019e60a-b910-45e2-8c8c-62a0c8982cc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.295906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9019e60a-b910-45e2-8c8c-62a0c8982cc4" (UID: "9019e60a-b910-45e2-8c8c-62a0c8982cc4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.310311 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd895e6c-1154-42f9-8490-b3832a9f815e" (UID: "dd895e6c-1154-42f9-8490-b3832a9f815e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.330147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd895e6c-1154-42f9-8490-b3832a9f815e" (UID: "dd895e6c-1154-42f9-8490-b3832a9f815e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.333031 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd895e6c-1154-42f9-8490-b3832a9f815e" (UID: "dd895e6c-1154-42f9-8490-b3832a9f815e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.338193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config" (OuterVolumeSpecName: "config") pod "dd895e6c-1154-42f9-8490-b3832a9f815e" (UID: "dd895e6c-1154-42f9-8490-b3832a9f815e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.344920 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data\") pod \"adc582c0-f416-4991-89c7-9ddb850c0f2b\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.344976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle\") pod \"adc582c0-f416-4991-89c7-9ddb850c0f2b\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.344996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data\") pod \"72ea7eea-19f9-4957-8cf2-7407a45137f2\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " Mar 20 15:11:44 
crc kubenswrapper[4764]: I0320 15:11:44.345067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key\") pod \"72ea7eea-19f9-4957-8cf2-7407a45137f2\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345093 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z28rx\" (UniqueName: \"kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx\") pod \"72ea7eea-19f9-4957-8cf2-7407a45137f2\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345116 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data\") pod \"adc582c0-f416-4991-89c7-9ddb850c0f2b\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs\") pod \"72ea7eea-19f9-4957-8cf2-7407a45137f2\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xskm2\" (UniqueName: \"kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2\") pod \"adc582c0-f416-4991-89c7-9ddb850c0f2b\" (UID: \"adc582c0-f416-4991-89c7-9ddb850c0f2b\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts\") pod \"72ea7eea-19f9-4957-8cf2-7407a45137f2\" (UID: \"72ea7eea-19f9-4957-8cf2-7407a45137f2\") " Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345588 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjlzj\" (UniqueName: \"kubernetes.io/projected/dd895e6c-1154-42f9-8490-b3832a9f815e-kube-api-access-rjlzj\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345605 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345614 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345624 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345632 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345641 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpcbp\" (UniqueName: \"kubernetes.io/projected/96075e3c-fec4-4453-89a3-ad47336d199b-kube-api-access-mpcbp\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345649 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9019e60a-b910-45e2-8c8c-62a0c8982cc4-logs\") on node \"crc\" 
DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345658 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96075e3c-fec4-4453-89a3-ad47336d199b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345667 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96075e3c-fec4-4453-89a3-ad47336d199b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345676 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpkh2\" (UniqueName: \"kubernetes.io/projected/9019e60a-b910-45e2-8c8c-62a0c8982cc4-kube-api-access-jpkh2\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345684 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345692 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd895e6c-1154-42f9-8490-b3832a9f815e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345700 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9019e60a-b910-45e2-8c8c-62a0c8982cc4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345709 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96075e3c-fec4-4453-89a3-ad47336d199b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345716 4764 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/9019e60a-b910-45e2-8c8c-62a0c8982cc4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.345990 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs" (OuterVolumeSpecName: "logs") pod "72ea7eea-19f9-4957-8cf2-7407a45137f2" (UID: "72ea7eea-19f9-4957-8cf2-7407a45137f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.346113 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data" (OuterVolumeSpecName: "config-data") pod "72ea7eea-19f9-4957-8cf2-7407a45137f2" (UID: "72ea7eea-19f9-4957-8cf2-7407a45137f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.346236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts" (OuterVolumeSpecName: "scripts") pod "72ea7eea-19f9-4957-8cf2-7407a45137f2" (UID: "72ea7eea-19f9-4957-8cf2-7407a45137f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.349213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2" (OuterVolumeSpecName: "kube-api-access-xskm2") pod "adc582c0-f416-4991-89c7-9ddb850c0f2b" (UID: "adc582c0-f416-4991-89c7-9ddb850c0f2b"). InnerVolumeSpecName "kube-api-access-xskm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.349348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "adc582c0-f416-4991-89c7-9ddb850c0f2b" (UID: "adc582c0-f416-4991-89c7-9ddb850c0f2b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.349354 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx" (OuterVolumeSpecName: "kube-api-access-z28rx") pod "72ea7eea-19f9-4957-8cf2-7407a45137f2" (UID: "72ea7eea-19f9-4957-8cf2-7407a45137f2"). InnerVolumeSpecName "kube-api-access-z28rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.349641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "72ea7eea-19f9-4957-8cf2-7407a45137f2" (UID: "72ea7eea-19f9-4957-8cf2-7407a45137f2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.366811 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adc582c0-f416-4991-89c7-9ddb850c0f2b" (UID: "adc582c0-f416-4991-89c7-9ddb850c0f2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.390716 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data" (OuterVolumeSpecName: "config-data") pod "adc582c0-f416-4991-89c7-9ddb850c0f2b" (UID: "adc582c0-f416-4991-89c7-9ddb850c0f2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447314 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72ea7eea-19f9-4957-8cf2-7407a45137f2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447353 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z28rx\" (UniqueName: \"kubernetes.io/projected/72ea7eea-19f9-4957-8cf2-7407a45137f2-kube-api-access-z28rx\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447367 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447392 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72ea7eea-19f9-4957-8cf2-7407a45137f2-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447402 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xskm2\" (UniqueName: \"kubernetes.io/projected/adc582c0-f416-4991-89c7-9ddb850c0f2b-kube-api-access-xskm2\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447410 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447418 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447427 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc582c0-f416-4991-89c7-9ddb850c0f2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.447435 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ea7eea-19f9-4957-8cf2-7407a45137f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.661699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95f45b77f-k4bfr" event={"ID":"9019e60a-b910-45e2-8c8c-62a0c8982cc4","Type":"ContainerDied","Data":"2c141e8a1c62865531b03081d7709e2c3b4d576ee4b1704e44330d9add356de7"} Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.661717 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-95f45b77f-k4bfr" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.665524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-klrf6" event={"ID":"adc582c0-f416-4991-89c7-9ddb850c0f2b","Type":"ContainerDied","Data":"168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6"} Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.665581 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="168320045af00982f9a50f42a241a3da2e2adfa969a45db72869897781097eb6" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.665547 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-klrf6" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.667344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-857f5dbd69-s9wj7" event={"ID":"96075e3c-fec4-4453-89a3-ad47336d199b","Type":"ContainerDied","Data":"e2c88bd93fb75c283abbe79be2f7df8507206961fc68ac4c3ae3c626ee1eb3c3"} Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.667442 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-857f5dbd69-s9wj7" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.669406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2jc7b" event={"ID":"dd895e6c-1154-42f9-8490-b3832a9f815e","Type":"ContainerDied","Data":"63952f0baeb758d9fbdbdea744b6c891b5afbd8efa3ca1712d309ca30640511c"} Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.669454 4764 scope.go:117] "RemoveContainer" containerID="b5f0eb3eac9afc6384858b2bbbbaf2ec68d8c0a38d04117b016885c0de25ad70" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.669574 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2jc7b" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.672465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5557c86df7-xp2rs" event={"ID":"72ea7eea-19f9-4957-8cf2-7407a45137f2","Type":"ContainerDied","Data":"04ef0cc5b3b231c923ecac2c2da4ea37cf4a7644d1924a3c6c08d4347ed41ee5"} Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.672570 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5557c86df7-xp2rs" Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.759291 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.774738 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-95f45b77f-k4bfr"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.787808 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.797976 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2jc7b"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.813644 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.818966 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5557c86df7-xp2rs"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.829348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:44 crc kubenswrapper[4764]: I0320 15:11:44.834471 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-857f5dbd69-s9wj7"] Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.136312 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="72ea7eea-19f9-4957-8cf2-7407a45137f2" path="/var/lib/kubelet/pods/72ea7eea-19f9-4957-8cf2-7407a45137f2/volumes" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.136778 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9019e60a-b910-45e2-8c8c-62a0c8982cc4" path="/var/lib/kubelet/pods/9019e60a-b910-45e2-8c8c-62a0c8982cc4/volumes" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.137173 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96075e3c-fec4-4453-89a3-ad47336d199b" path="/var/lib/kubelet/pods/96075e3c-fec4-4453-89a3-ad47336d199b/volumes" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.137510 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" path="/var/lib/kubelet/pods/dd895e6c-1154-42f9-8490-b3832a9f815e/volumes" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.573220 4764 scope.go:117] "RemoveContainer" containerID="c1bdff07276dc312b4ea36ad411d5b6db0e534267f9da853550ca49e6e0cd673" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.576580 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.576768 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f5rpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6zj6m_openstack(337e2278-00e7-428e-97c1-c8d940d83aa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.578293 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6zj6m" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.758497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerStarted","Data":"a05d6c3300d06509b07f48e78be82b6babdff234c8e065151d6eeeed4f235ad7"} Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.771254 4764 generic.go:334] "Generic (PLEG): container finished" podID="8d3ec055-05fb-42c0-bb97-342be3f1e32d" containerID="8eff8c0ac05e54673fbf6c7e2b62165935fca4bb2a85b981663fe393db72cd8f" exitCode=0 Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.772151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5v7z" event={"ID":"8d3ec055-05fb-42c0-bb97-342be3f1e32d","Type":"ContainerDied","Data":"8eff8c0ac05e54673fbf6c7e2b62165935fca4bb2a85b981663fe393db72cd8f"} Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.772540 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.772840 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="init" Mar 20 15:11:45 crc 
kubenswrapper[4764]: I0320 15:11:45.772852 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="init" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.772860 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" containerName="glance-db-sync" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.772866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" containerName="glance-db-sync" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.772885 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.772891 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.773699 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.773721 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" containerName="glance-db-sync" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.774489 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: E0320 15:11:45.776830 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6zj6m" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.799372 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhrf\" (UniqueName: \"kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970258 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:45 crc kubenswrapper[4764]: I0320 15:11:45.970346 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhrf\" (UniqueName: \"kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074492 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.074574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.075719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.076832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.077301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.079802 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.080431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.106514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhrf\" (UniqueName: \"kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf\") pod \"dnsmasq-dns-785d8bcb8c-wtbjq\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.133963 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-655785589d-5cnb4"] Mar 20 15:11:46 crc kubenswrapper[4764]: W0320 15:11:46.140082 4764 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb148fab_0227_4725_af4e_d6dba5740303.slice/crio-c17495151d7332cd46fdb5ec7383dead63cfda8972e42a4cf7258d7078bdbae3 WatchSource:0}: Error finding container c17495151d7332cd46fdb5ec7383dead63cfda8972e42a4cf7258d7078bdbae3: Status 404 returned error can't find the container with id c17495151d7332cd46fdb5ec7383dead63cfda8972e42a4cf7258d7078bdbae3 Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.210580 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.250637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8mz56"] Mar 20 15:11:46 crc kubenswrapper[4764]: W0320 15:11:46.258559 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a8a0840_2a17_42d6_94e5_19653a16ff80.slice/crio-78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b WatchSource:0}: Error finding container 78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b: Status 404 returned error can't find the container with id 78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.651121 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.702845 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.704292 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.707632 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qbgl2" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.707808 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.707948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.715123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.780104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerStarted","Data":"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.780311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerStarted","Data":"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.782658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xhldv" event={"ID":"df97d5e8-2808-4bef-9fad-b54c27554d23","Type":"ContainerStarted","Data":"936b27acc46d1ccbb24af8fbf737a64c6218c1631d1a86cdb9edb1a0a9c2e08e"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.788664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655785589d-5cnb4" event={"ID":"cb148fab-0227-4725-af4e-d6dba5740303","Type":"ContainerStarted","Data":"3427f699f490fe37291115840feba96f8ad65dee0b615e9b03ad94a9d96e7e8a"} Mar 20 
15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.788772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655785589d-5cnb4" event={"ID":"cb148fab-0227-4725-af4e-d6dba5740303","Type":"ContainerStarted","Data":"9f7c414725a91ab9a8882d546d6bb14b5fe88c8c539aae89290d229122d46740"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.788835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-655785589d-5cnb4" event={"ID":"cb148fab-0227-4725-af4e-d6dba5740303","Type":"ContainerStarted","Data":"c17495151d7332cd46fdb5ec7383dead63cfda8972e42a4cf7258d7078bdbae3"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.790791 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.790887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpr28\" (UniqueName: \"kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.790961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.791069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.791151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.791213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.791279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.791966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4nl4" event={"ID":"329bd08e-9bf1-4c6e-b234-e99022daa848","Type":"ContainerStarted","Data":"e5563bf00571bb7280679430090642c4974f87cc237824756668b0d90105aacb"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.795140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8mz56" 
event={"ID":"7a8a0840-2a17-42d6-94e5-19653a16ff80","Type":"ContainerStarted","Data":"5da5d2fb8d72b9c9b4b323b0b0b58b216e7e9aefbfeb0d6b2465effdb30193d4"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.795186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8mz56" event={"ID":"7a8a0840-2a17-42d6-94e5-19653a16ff80","Type":"ContainerStarted","Data":"78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b"} Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.801243 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66899c9d8-zh5gp" podStartSLOduration=22.336205997 podStartE2EDuration="22.801223474s" podCreationTimestamp="2026-03-20 15:11:24 +0000 UTC" firstStartedPulling="2026-03-20 15:11:45.535478895 +0000 UTC m=+1227.151668044" lastFinishedPulling="2026-03-20 15:11:46.000496392 +0000 UTC m=+1227.616685521" observedRunningTime="2026-03-20 15:11:46.801024498 +0000 UTC m=+1228.417213627" watchObservedRunningTime="2026-03-20 15:11:46.801223474 +0000 UTC m=+1228.417412603" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.823195 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8mz56" podStartSLOduration=17.823180733 podStartE2EDuration="17.823180733s" podCreationTimestamp="2026-03-20 15:11:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:46.822637986 +0000 UTC m=+1228.438827115" watchObservedRunningTime="2026-03-20 15:11:46.823180733 +0000 UTC m=+1228.439369862" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.839142 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-q4nl4" podStartSLOduration=4.687062581 podStartE2EDuration="31.839128086s" podCreationTimestamp="2026-03-20 15:11:15 +0000 UTC" firstStartedPulling="2026-03-20 
15:11:16.854801309 +0000 UTC m=+1198.470990438" lastFinishedPulling="2026-03-20 15:11:44.006866824 +0000 UTC m=+1225.623055943" observedRunningTime="2026-03-20 15:11:46.837788905 +0000 UTC m=+1228.453978034" watchObservedRunningTime="2026-03-20 15:11:46.839128086 +0000 UTC m=+1228.455317215" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.857818 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xhldv" podStartSLOduration=5.040409853 podStartE2EDuration="31.857800934s" podCreationTimestamp="2026-03-20 15:11:15 +0000 UTC" firstStartedPulling="2026-03-20 15:11:17.186595954 +0000 UTC m=+1198.802785073" lastFinishedPulling="2026-03-20 15:11:44.003987025 +0000 UTC m=+1225.620176154" observedRunningTime="2026-03-20 15:11:46.853998247 +0000 UTC m=+1228.470187396" watchObservedRunningTime="2026-03-20 15:11:46.857800934 +0000 UTC m=+1228.473990063" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.869862 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.871344 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.874763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.886186 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.887868 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-655785589d-5cnb4" podStartSLOduration=22.887850124 podStartE2EDuration="22.887850124s" podCreationTimestamp="2026-03-20 15:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:46.875117399 +0000 UTC m=+1228.491306528" watchObservedRunningTime="2026-03-20 15:11:46.887850124 +0000 UTC m=+1228.504039243" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.895843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.895912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpr28\" (UniqueName: \"kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.895979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.896191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.896307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.896354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.896405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.897944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs\") pod \"glance-default-external-api-0\" (UID: 
\"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.898674 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.907712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.908931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.923489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.927925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" 
Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.944251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.945927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpr28\" (UniqueName: \"kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28\") pod \"glance-default-external-api-0\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2tx\" (UniqueName: \"kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998398 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc 
kubenswrapper[4764]: I0320 15:11:46.998416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998480 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:46 crc kubenswrapper[4764]: I0320 15:11:46.998509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.036662 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.095571 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-2jc7b" podUID="dd895e6c-1154-42f9-8490-b3832a9f815e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.116:5353: i/o timeout" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2tx\" (UniqueName: \"kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101568 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.101746 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.105476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.107932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.111709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.114796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.115321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.119055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2tx\" (UniqueName: \"kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.155934 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p5v7z" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.180759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.299829 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.304510 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle\") pod \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.304572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmd9s\" (UniqueName: \"kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s\") pod \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.304643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config\") pod \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\" (UID: \"8d3ec055-05fb-42c0-bb97-342be3f1e32d\") " Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.309883 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s" (OuterVolumeSpecName: "kube-api-access-xmd9s") pod "8d3ec055-05fb-42c0-bb97-342be3f1e32d" (UID: 
"8d3ec055-05fb-42c0-bb97-342be3f1e32d"). InnerVolumeSpecName "kube-api-access-xmd9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.357571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config" (OuterVolumeSpecName: "config") pod "8d3ec055-05fb-42c0-bb97-342be3f1e32d" (UID: "8d3ec055-05fb-42c0-bb97-342be3f1e32d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.359921 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d3ec055-05fb-42c0-bb97-342be3f1e32d" (UID: "8d3ec055-05fb-42c0-bb97-342be3f1e32d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.407062 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.407334 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmd9s\" (UniqueName: \"kubernetes.io/projected/8d3ec055-05fb-42c0-bb97-342be3f1e32d-kube-api-access-xmd9s\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.407346 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d3ec055-05fb-42c0-bb97-342be3f1e32d-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.656815 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:47 crc 
kubenswrapper[4764]: W0320 15:11:47.658979 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85fe2438_0a1e_4516_9da9_69b51291b035.slice/crio-1d1853e041f96fb399e009867ab1e4de0b6a86ffa8066ece7bd54ed876d29e39 WatchSource:0}: Error finding container 1d1853e041f96fb399e009867ab1e4de0b6a86ffa8066ece7bd54ed876d29e39: Status 404 returned error can't find the container with id 1d1853e041f96fb399e009867ab1e4de0b6a86ffa8066ece7bd54ed876d29e39 Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.803678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerStarted","Data":"1d1853e041f96fb399e009867ab1e4de0b6a86ffa8066ece7bd54ed876d29e39"} Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.806692 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5v7z" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.806714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5v7z" event={"ID":"8d3ec055-05fb-42c0-bb97-342be3f1e32d","Type":"ContainerDied","Data":"06b00029b4fc0c4b54f8c6503dd323304471901ef0b1cccbc52e475be9f9ca7c"} Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.807682 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b00029b4fc0c4b54f8c6503dd323304471901ef0b1cccbc52e475be9f9ca7c" Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.808475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerStarted","Data":"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897"} Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.809983 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="44e6108c-2c31-43e0-b8af-615a426b4887" containerID="f536dab78e8b9243b29f97e7ed2525c9e96c015579c5fb2848a2f13946030b2e" exitCode=0 Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.810548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" event={"ID":"44e6108c-2c31-43e0-b8af-615a426b4887","Type":"ContainerDied","Data":"f536dab78e8b9243b29f97e7ed2525c9e96c015579c5fb2848a2f13946030b2e"} Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.810604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" event={"ID":"44e6108c-2c31-43e0-b8af-615a426b4887","Type":"ContainerStarted","Data":"aedd915bee65c5a9dbc12f33041602f82b63ca5dfb1ef96f2c111d01d8851ccc"} Mar 20 15:11:47 crc kubenswrapper[4764]: I0320 15:11:47.896601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.017749 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.058448 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:11:48 crc kubenswrapper[4764]: E0320 15:11:48.058903 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3ec055-05fb-42c0-bb97-342be3f1e32d" containerName="neutron-db-sync" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.058928 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3ec055-05fb-42c0-bb97-342be3f1e32d" containerName="neutron-db-sync" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.061149 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3ec055-05fb-42c0-bb97-342be3f1e32d" containerName="neutron-db-sync" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.062250 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.073132 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.134749 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.134848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.134882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.134993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngmv\" (UniqueName: \"kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.135045 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.135108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.186008 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.187395 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.191960 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.192214 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cn5fs" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.192412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.192546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.225798 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.236792 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.236886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.236963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.237004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.237028 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.237084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngmv\" (UniqueName: 
\"kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.238317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.238882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.239445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.239749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.243806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc\") pod 
\"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.265016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngmv\" (UniqueName: \"kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv\") pod \"dnsmasq-dns-55f844cf75-k7b98\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.340845 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.340923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.340941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.340968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54d59\" (UniqueName: \"kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59\") pod 
\"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.341029 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.412587 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.442211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.442251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.442273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54d59\" (UniqueName: \"kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.442335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.442396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.449371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.450286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.450793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.452032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: 
\"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.464056 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54d59\" (UniqueName: \"kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59\") pod \"neutron-5b6dc455f6-dhtvx\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.526877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.775485 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.882958 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.965490 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:11:48 crc kubenswrapper[4764]: I0320 15:11:48.982272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerStarted","Data":"1d8f432924373d8d1c50d2cbbbb708603d980808d3db1e26fc0b2497d03a13a1"} Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.067283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" event={"ID":"44e6108c-2c31-43e0-b8af-615a426b4887","Type":"ContainerStarted","Data":"777a0d6fc3bb094d71e5ff4d0e7245eaf3f206f725f13ca3d34126e2b3c664c6"} Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.067453 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" 
podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="dnsmasq-dns" containerID="cri-o://777a0d6fc3bb094d71e5ff4d0e7245eaf3f206f725f13ca3d34126e2b3c664c6" gracePeriod=10 Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.068078 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.102865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerStarted","Data":"a12ee63a1455e376068b35244c328013d7a5bc1a21968702054e287427b8cafd"} Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.117370 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" podStartSLOduration=4.117351138 podStartE2EDuration="4.117351138s" podCreationTimestamp="2026-03-20 15:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:49.107083651 +0000 UTC m=+1230.723272780" watchObservedRunningTime="2026-03-20 15:11:49.117351138 +0000 UTC m=+1230.733540277" Mar 20 15:11:49 crc kubenswrapper[4764]: I0320 15:11:49.362524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"] Mar 20 15:11:49 crc kubenswrapper[4764]: W0320 15:11:49.761449 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a8f151d_8a23_42d9_90a1_65caade3b03e.slice/crio-fe7bb9439d51b2652644d0e4848439d8aff1dcf583a55a74a6421964363d7416 WatchSource:0}: Error finding container fe7bb9439d51b2652644d0e4848439d8aff1dcf583a55a74a6421964363d7416: Status 404 returned error can't find the container with id fe7bb9439d51b2652644d0e4848439d8aff1dcf583a55a74a6421964363d7416 Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 
15:11:50.117396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerStarted","Data":"4210bda3d4892f4e9bf22d63b64747f9bc679b8c3cfe3f62c352729942fa7923"} Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.126294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerStarted","Data":"faab6afefbd81ed2e90a84036872532db8b8b95fa6eeac3299ef37725d4d0b43"} Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.136002 4764 generic.go:334] "Generic (PLEG): container finished" podID="44e6108c-2c31-43e0-b8af-615a426b4887" containerID="777a0d6fc3bb094d71e5ff4d0e7245eaf3f206f725f13ca3d34126e2b3c664c6" exitCode=0 Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.136084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" event={"ID":"44e6108c-2c31-43e0-b8af-615a426b4887","Type":"ContainerDied","Data":"777a0d6fc3bb094d71e5ff4d0e7245eaf3f206f725f13ca3d34126e2b3c664c6"} Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.139476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" event={"ID":"51124ecf-d50f-478d-9a87-f81f7e72571e","Type":"ContainerStarted","Data":"6b7515d1aad8ed273484631b9ab4ece7f4f480327fc336977b743630c7c9a158"} Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.142955 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerStarted","Data":"fe7bb9439d51b2652644d0e4848439d8aff1dcf583a55a74a6421964363d7416"} Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.245425 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.316761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhrf\" (UniqueName: \"kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.317069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.317097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.317135 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.317164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.317209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config\") pod \"44e6108c-2c31-43e0-b8af-615a426b4887\" (UID: \"44e6108c-2c31-43e0-b8af-615a426b4887\") " Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.330282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf" (OuterVolumeSpecName: "kube-api-access-8rhrf") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "kube-api-access-8rhrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.373793 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.383055 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config" (OuterVolumeSpecName: "config") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.383768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.384846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.407896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "44e6108c-2c31-43e0-b8af-615a426b4887" (UID: "44e6108c-2c31-43e0-b8af-615a426b4887"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.418696 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.418873 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.418933 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.418989 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.419042 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e6108c-2c31-43e0-b8af-615a426b4887-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:50 crc kubenswrapper[4764]: I0320 15:11:50.419125 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhrf\" (UniqueName: \"kubernetes.io/projected/44e6108c-2c31-43e0-b8af-615a426b4887-kube-api-access-8rhrf\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.161831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" event={"ID":"44e6108c-2c31-43e0-b8af-615a426b4887","Type":"ContainerDied","Data":"aedd915bee65c5a9dbc12f33041602f82b63ca5dfb1ef96f2c111d01d8851ccc"} Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.162136 4764 scope.go:117] "RemoveContainer" containerID="777a0d6fc3bb094d71e5ff4d0e7245eaf3f206f725f13ca3d34126e2b3c664c6" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.162276 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-wtbjq" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.168583 4764 generic.go:334] "Generic (PLEG): container finished" podID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerID="4dbfdccd835db6fc255f853553a30417cb5aa92d0d331b3563aafe0f6022a66f" exitCode=0 Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.168642 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" event={"ID":"51124ecf-d50f-478d-9a87-f81f7e72571e","Type":"ContainerDied","Data":"4dbfdccd835db6fc255f853553a30417cb5aa92d0d331b3563aafe0f6022a66f"} Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.175020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerStarted","Data":"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a"} Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.175067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerStarted","Data":"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283"} Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.175822 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.177087 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-log" containerID="cri-o://a12ee63a1455e376068b35244c328013d7a5bc1a21968702054e287427b8cafd" gracePeriod=30 Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.177359 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-log" containerID="cri-o://faab6afefbd81ed2e90a84036872532db8b8b95fa6eeac3299ef37725d4d0b43" gracePeriod=30 Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.177436 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-httpd" containerID="cri-o://4210bda3d4892f4e9bf22d63b64747f9bc679b8c3cfe3f62c352729942fa7923" gracePeriod=30 Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.177460 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-httpd" containerID="cri-o://5dfca99798dbfee01df7dff1c8e661dbdd011cae247e8fa49cd3690234374f92" gracePeriod=30 Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.177523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerStarted","Data":"5dfca99798dbfee01df7dff1c8e661dbdd011cae247e8fa49cd3690234374f92"} Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.239003 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.238987156 podStartE2EDuration="6.238987156s" podCreationTimestamp="2026-03-20 15:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:51.210847305 +0000 UTC m=+1232.827036424" watchObservedRunningTime="2026-03-20 15:11:51.238987156 +0000 UTC m=+1232.855176285" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.263005 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b6dc455f6-dhtvx" podStartSLOduration=3.262986159 
podStartE2EDuration="3.262986159s" podCreationTimestamp="2026-03-20 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:51.238681117 +0000 UTC m=+1232.854870246" watchObservedRunningTime="2026-03-20 15:11:51.262986159 +0000 UTC m=+1232.879175288" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.267755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.267747196 podStartE2EDuration="6.267747196s" podCreationTimestamp="2026-03-20 15:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:51.262600487 +0000 UTC m=+1232.878789616" watchObservedRunningTime="2026-03-20 15:11:51.267747196 +0000 UTC m=+1232.883936325" Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.287467 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:51 crc kubenswrapper[4764]: I0320 15:11:51.299554 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-wtbjq"] Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.127833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:11:52 crc kubenswrapper[4764]: E0320 15:11:52.134097 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="init" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.134348 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="init" Mar 20 15:11:52 crc kubenswrapper[4764]: E0320 15:11:52.134437 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="dnsmasq-dns" Mar 20 
15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.134497 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="dnsmasq-dns" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.134727 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" containerName="dnsmasq-dns" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.135790 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.139734 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.139911 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.146119 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.218158 4764 generic.go:334] "Generic (PLEG): container finished" podID="df97d5e8-2808-4bef-9fad-b54c27554d23" containerID="936b27acc46d1ccbb24af8fbf737a64c6218c1631d1a86cdb9edb1a0a9c2e08e" exitCode=0 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.218230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xhldv" event={"ID":"df97d5e8-2808-4bef-9fad-b54c27554d23","Type":"ContainerDied","Data":"936b27acc46d1ccbb24af8fbf737a64c6218c1631d1a86cdb9edb1a0a9c2e08e"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.226234 4764 generic.go:334] "Generic (PLEG): container finished" podID="85fe2438-0a1e-4516-9da9-69b51291b035" containerID="4210bda3d4892f4e9bf22d63b64747f9bc679b8c3cfe3f62c352729942fa7923" exitCode=0 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.226256 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="85fe2438-0a1e-4516-9da9-69b51291b035" containerID="a12ee63a1455e376068b35244c328013d7a5bc1a21968702054e287427b8cafd" exitCode=143 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.226287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerDied","Data":"4210bda3d4892f4e9bf22d63b64747f9bc679b8c3cfe3f62c352729942fa7923"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.226303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerDied","Data":"a12ee63a1455e376068b35244c328013d7a5bc1a21968702054e287427b8cafd"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.238613 4764 generic.go:334] "Generic (PLEG): container finished" podID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerID="5dfca99798dbfee01df7dff1c8e661dbdd011cae247e8fa49cd3690234374f92" exitCode=0 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.238656 4764 generic.go:334] "Generic (PLEG): container finished" podID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerID="faab6afefbd81ed2e90a84036872532db8b8b95fa6eeac3299ef37725d4d0b43" exitCode=143 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.238735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerDied","Data":"5dfca99798dbfee01df7dff1c8e661dbdd011cae247e8fa49cd3690234374f92"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.238761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerDied","Data":"faab6afefbd81ed2e90a84036872532db8b8b95fa6eeac3299ef37725d4d0b43"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.245038 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="329bd08e-9bf1-4c6e-b234-e99022daa848" containerID="e5563bf00571bb7280679430090642c4974f87cc237824756668b0d90105aacb" exitCode=0 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.245096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4nl4" event={"ID":"329bd08e-9bf1-4c6e-b234-e99022daa848","Type":"ContainerDied","Data":"e5563bf00571bb7280679430090642c4974f87cc237824756668b0d90105aacb"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.247828 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a8a0840-2a17-42d6-94e5-19653a16ff80" containerID="5da5d2fb8d72b9c9b4b323b0b0b58b216e7e9aefbfeb0d6b2465effdb30193d4" exitCode=0 Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.248309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8mz56" event={"ID":"7a8a0840-2a17-42d6-94e5-19653a16ff80","Type":"ContainerDied","Data":"5da5d2fb8d72b9c9b4b323b0b0b58b216e7e9aefbfeb0d6b2465effdb30193d4"} Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzncg\" (UniqueName: \"kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.266434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.367577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.367673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzncg\" (UniqueName: \"kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.367736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.367851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.368952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.369006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs\") pod 
\"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.369025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.374354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.374686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.389565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.393349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzncg\" (UniqueName: \"kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc 
kubenswrapper[4764]: I0320 15:11:52.394961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.395336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.396578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs\") pod \"neutron-fbdf79ddf-5llj7\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:52 crc kubenswrapper[4764]: I0320 15:11:52.495151 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:11:53 crc kubenswrapper[4764]: I0320 15:11:53.139253 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e6108c-2c31-43e0-b8af-615a426b4887" path="/var/lib/kubelet/pods/44e6108c-2c31-43e0-b8af-615a426b4887/volumes" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.583600 4764 scope.go:117] "RemoveContainer" containerID="f536dab78e8b9243b29f97e7ed2525c9e96c015579c5fb2848a2f13946030b2e" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.603606 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q4nl4" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.614972 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.627896 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.713081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64dx5\" (UniqueName: \"kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5\") pod \"329bd08e-9bf1-4c6e-b234-e99022daa848\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.713321 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle\") pod \"329bd08e-9bf1-4c6e-b234-e99022daa848\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.713350 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data\") pod \"329bd08e-9bf1-4c6e-b234-e99022daa848\" (UID: \"329bd08e-9bf1-4c6e-b234-e99022daa848\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.721055 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "329bd08e-9bf1-4c6e-b234-e99022daa848" (UID: "329bd08e-9bf1-4c6e-b234-e99022daa848"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.729080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5" (OuterVolumeSpecName: "kube-api-access-64dx5") pod "329bd08e-9bf1-4c6e-b234-e99022daa848" (UID: "329bd08e-9bf1-4c6e-b234-e99022daa848"). InnerVolumeSpecName "kube-api-access-64dx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.752518 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.753497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.753688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "329bd08e-9bf1-4c6e-b234-e99022daa848" (UID: "329bd08e-9bf1-4c6e-b234-e99022daa848"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.788432 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.789096 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxlgz\" (UniqueName: \"kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz\") pod \"df97d5e8-2808-4bef-9fad-b54c27554d23\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts\") pod \"df97d5e8-2808-4bef-9fad-b54c27554d23\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data\") pod \"df97d5e8-2808-4bef-9fad-b54c27554d23\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814794 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs\") pod \"df97d5e8-2808-4bef-9fad-b54c27554d23\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.814938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle\") pod \"df97d5e8-2808-4bef-9fad-b54c27554d23\" (UID: \"df97d5e8-2808-4bef-9fad-b54c27554d23\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815079 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmrzw\" (UniqueName: \"kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 
15:11:54.815552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys\") pod \"7a8a0840-2a17-42d6-94e5-19653a16ff80\" (UID: \"7a8a0840-2a17-42d6-94e5-19653a16ff80\") " Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815873 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64dx5\" (UniqueName: \"kubernetes.io/projected/329bd08e-9bf1-4c6e-b234-e99022daa848-kube-api-access-64dx5\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815888 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.815897 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/329bd08e-9bf1-4c6e-b234-e99022daa848-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.819175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs" (OuterVolumeSpecName: "logs") pod "df97d5e8-2808-4bef-9fad-b54c27554d23" (UID: "df97d5e8-2808-4bef-9fad-b54c27554d23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.824650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts" (OuterVolumeSpecName: "scripts") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.827122 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz" (OuterVolumeSpecName: "kube-api-access-mxlgz") pod "df97d5e8-2808-4bef-9fad-b54c27554d23" (UID: "df97d5e8-2808-4bef-9fad-b54c27554d23"). InnerVolumeSpecName "kube-api-access-mxlgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.831481 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts" (OuterVolumeSpecName: "scripts") pod "df97d5e8-2808-4bef-9fad-b54c27554d23" (UID: "df97d5e8-2808-4bef-9fad-b54c27554d23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.830476 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.844983 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw" (OuterVolumeSpecName: "kube-api-access-dmrzw") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "kube-api-access-dmrzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.845882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917142 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917176 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxlgz\" (UniqueName: \"kubernetes.io/projected/df97d5e8-2808-4bef-9fad-b54c27554d23-kube-api-access-mxlgz\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917187 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917196 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df97d5e8-2808-4bef-9fad-b54c27554d23-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917204 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917211 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.917218 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmrzw\" (UniqueName: \"kubernetes.io/projected/7a8a0840-2a17-42d6-94e5-19653a16ff80-kube-api-access-dmrzw\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.939166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data" (OuterVolumeSpecName: "config-data") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.971149 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.976005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a8a0840-2a17-42d6-94e5-19653a16ff80" (UID: "7a8a0840-2a17-42d6-94e5-19653a16ff80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:54 crc kubenswrapper[4764]: I0320 15:11:54.983446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data" (OuterVolumeSpecName: "config-data") pod "df97d5e8-2808-4bef-9fad-b54c27554d23" (UID: "df97d5e8-2808-4bef-9fad-b54c27554d23"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.006641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df97d5e8-2808-4bef-9fad-b54c27554d23" (UID: "df97d5e8-2808-4bef-9fad-b54c27554d23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.018363 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.018407 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.018416 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df97d5e8-2808-4bef-9fad-b54c27554d23-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.018426 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a0840-2a17-42d6-94e5-19653a16ff80-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.118996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119122 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119160 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119197 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119258 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpr28\" (UniqueName: \"kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.119279 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"85fe2438-0a1e-4516-9da9-69b51291b035\" (UID: \"85fe2438-0a1e-4516-9da9-69b51291b035\") " Mar 20 15:11:55 crc 
kubenswrapper[4764]: I0320 15:11:55.119735 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs" (OuterVolumeSpecName: "logs") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.120287 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.125888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.125909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts" (OuterVolumeSpecName: "scripts") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.129875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28" (OuterVolumeSpecName: "kube-api-access-gpr28") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "kube-api-access-gpr28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.140934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.165530 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data" (OuterVolumeSpecName: "config-data") pod "85fe2438-0a1e-4516-9da9-69b51291b035" (UID: "85fe2438-0a1e-4516-9da9-69b51291b035"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.215544 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221151 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221174 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221183 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221192 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85fe2438-0a1e-4516-9da9-69b51291b035-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221200 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85fe2438-0a1e-4516-9da9-69b51291b035-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221211 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpr28\" (UniqueName: \"kubernetes.io/projected/85fe2438-0a1e-4516-9da9-69b51291b035-kube-api-access-gpr28\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.221243 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 15:11:55 crc kubenswrapper[4764]: W0320 15:11:55.246347 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07deed84_d17d_4b5f_955d_7087ecbc782d.slice/crio-8498500cf242d708c9226c9da6acbc01edab34bce7f4b1a0a7682557e14f851e WatchSource:0}: Error finding container 8498500cf242d708c9226c9da6acbc01edab34bce7f4b1a0a7682557e14f851e: Status 404 returned error can't find the container with id 8498500cf242d708c9226c9da6acbc01edab34bce7f4b1a0a7682557e14f851e Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.261091 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.270394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-q4nl4" event={"ID":"329bd08e-9bf1-4c6e-b234-e99022daa848","Type":"ContainerDied","Data":"e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.270439 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3b00e845b20941ddd494dd29fb145f2838c3946f8129a8bb15162f807b8ba1a" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.270881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-q4nl4" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.271510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerStarted","Data":"8498500cf242d708c9226c9da6acbc01edab34bce7f4b1a0a7682557e14f851e"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.275449 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8mz56" event={"ID":"7a8a0840-2a17-42d6-94e5-19653a16ff80","Type":"ContainerDied","Data":"78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.275501 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78509d1ef7b84cd16cbb27dc460b5cd9275eebd59a50a39f8ff649733a4e651b" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.275595 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8mz56" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.279528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xhldv" event={"ID":"df97d5e8-2808-4bef-9fad-b54c27554d23","Type":"ContainerDied","Data":"62366ea3c949a7ec430f3ccb32643621d398bae0b8b306ee5fc07bcb7528ce26"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.279580 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xhldv" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.279584 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62366ea3c949a7ec430f3ccb32643621d398bae0b8b306ee5fc07bcb7528ce26" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.283318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" event={"ID":"51124ecf-d50f-478d-9a87-f81f7e72571e","Type":"ContainerStarted","Data":"97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.283415 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.286071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"85fe2438-0a1e-4516-9da9-69b51291b035","Type":"ContainerDied","Data":"1d1853e041f96fb399e009867ab1e4de0b6a86ffa8066ece7bd54ed876d29e39"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.286103 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.286110 4764 scope.go:117] "RemoveContainer" containerID="4210bda3d4892f4e9bf22d63b64747f9bc679b8c3cfe3f62c352729942fa7923" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.288373 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerStarted","Data":"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5"} Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.310815 4764 scope.go:117] "RemoveContainer" containerID="a12ee63a1455e376068b35244c328013d7a5bc1a21968702054e287427b8cafd" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.311122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" podStartSLOduration=7.311104143 podStartE2EDuration="7.311104143s" podCreationTimestamp="2026-03-20 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:55.307686088 +0000 UTC m=+1236.923875217" watchObservedRunningTime="2026-03-20 15:11:55.311104143 +0000 UTC m=+1236.927293272" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.327144 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.334510 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.343645 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.354894 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:55 crc kubenswrapper[4764]: E0320 15:11:55.355242 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8a0840-2a17-42d6-94e5-19653a16ff80" containerName="keystone-bootstrap" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355260 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8a0840-2a17-42d6-94e5-19653a16ff80" containerName="keystone-bootstrap" Mar 20 15:11:55 crc kubenswrapper[4764]: E0320 15:11:55.355280 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-httpd" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355287 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-httpd" Mar 20 15:11:55 crc kubenswrapper[4764]: E0320 15:11:55.355302 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-log" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355308 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-log" Mar 20 15:11:55 crc kubenswrapper[4764]: E0320 15:11:55.355318 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df97d5e8-2808-4bef-9fad-b54c27554d23" containerName="placement-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355323 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df97d5e8-2808-4bef-9fad-b54c27554d23" containerName="placement-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: E0320 15:11:55.355336 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329bd08e-9bf1-4c6e-b234-e99022daa848" containerName="barbican-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355342 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="329bd08e-9bf1-4c6e-b234-e99022daa848" 
containerName="barbican-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355522 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8a0840-2a17-42d6-94e5-19653a16ff80" containerName="keystone-bootstrap" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355546 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-log" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355564 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df97d5e8-2808-4bef-9fad-b54c27554d23" containerName="placement-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355579 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="329bd08e-9bf1-4c6e-b234-e99022daa848" containerName="barbican-db-sync" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.355599 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" containerName="glance-httpd" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.358137 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.363062 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.363280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.365300 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs242\" (UniqueName: \"kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.529866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631299 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631499 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631522 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.631582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs242\" (UniqueName: \"kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.632304 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.632420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.632932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.636711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.645545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.646960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.652872 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.661137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs242\" (UniqueName: \"kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" 
Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.691711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.714424 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.786073 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.791681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.798186 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.798367 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.798481 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.808645 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.809368 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-clqvv" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.809886 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.925128 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/keystone-7c69f8c7f-rlffj"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.926513 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938077 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938136 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxb8\" (UniqueName: \"kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.938290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.940858 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrt4" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.941046 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.941163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.941265 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.946426 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.947773 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.959290 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"] Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.968278 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.968478 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.968601 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.968873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l8z6t" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.968997 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.969646 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.974706 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 15:11:55 crc kubenswrapper[4764]: I0320 15:11:55.991440 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"] Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.009831 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c69f8c7f-rlffj"] Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.019016 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"] Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049562 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbkjn\" (UniqueName: \"kubernetes.io/projected/8d9fa689-47c1-464b-82a8-9f047a084ed7-kube-api-access-jbkjn\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049611 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tkd\" (UniqueName: \"kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-scripts\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049709 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-credential-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 
15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-internal-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxb8\" (UniqueName: \"kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049776 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-combined-ca-bundle\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-public-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc 
kubenswrapper[4764]: I0320 15:11:56.049833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-config-data\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-fernet-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " 
pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.049979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.059926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.079753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.082164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc 
kubenswrapper[4764]: I0320 15:11:56.082679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.085576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.098098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxb8\" (UniqueName: \"kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.104158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data\") pod \"placement-6bdffd5796-4724r\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.138987 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.152660 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153677 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-fernet-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153759 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbkjn\" (UniqueName: \"kubernetes.io/projected/8d9fa689-47c1-464b-82a8-9f047a084ed7-kube-api-access-jbkjn\") pod \"keystone-7c69f8c7f-rlffj\" (UID: 
\"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tkd\" (UniqueName: \"kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-scripts\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom\") pod 
\"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153878 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-credential-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-internal-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4llj\" (UniqueName: \"kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-combined-ca-bundle\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-public-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.153993 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-config-data\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.154026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.167053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.178895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.182310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-fernet-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.183059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-credential-keys\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.183366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-scripts\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.184153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-combined-ca-bundle\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.190930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-config-data\") pod \"keystone-7c69f8c7f-rlffj\" (UID: 
\"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.196581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.197080 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-public-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.208367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d9fa689-47c1-464b-82a8-9f047a084ed7-internal-tls-certs\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") " pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.216926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.223940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbkjn\" (UniqueName: \"kubernetes.io/projected/8d9fa689-47c1-464b-82a8-9f047a084ed7-kube-api-access-jbkjn\") pod \"keystone-7c69f8c7f-rlffj\" (UID: \"8d9fa689-47c1-464b-82a8-9f047a084ed7\") 
" pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.231022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tkd\" (UniqueName: \"kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd\") pod \"barbican-keystone-listener-678fbbd7fd-2hzg8\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258047 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp2tx\" (UniqueName: \"kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258182 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs\") pod \"09483d11-68f6-41a5-928d-dacd558d1d0d\" (UID: \"09483d11-68f6-41a5-928d-dacd558d1d0d\") " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258495 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: 
\"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4llj\" (UniqueName: \"kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.258951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.262301 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.262505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs" (OuterVolumeSpecName: "logs") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.265110 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.274055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.285586 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.287696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts" (OuterVolumeSpecName: "scripts") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.288185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.333304 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.335004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.350243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx" (OuterVolumeSpecName: "kube-api-access-dp2tx") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "kube-api-access-dp2tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.351730 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.384506 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp2tx\" (UniqueName: \"kubernetes.io/projected/09483d11-68f6-41a5-928d-dacd558d1d0d-kube-api-access-dp2tx\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.384547 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.384558 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09483d11-68f6-41a5-928d-dacd558d1d0d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.384568 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.384602 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.418292 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"] Mar 20 15:11:56 crc kubenswrapper[4764]: E0320 15:11:56.418988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-log" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.419003 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-log" Mar 20 15:11:56 crc kubenswrapper[4764]: E0320 15:11:56.419034 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-httpd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.419041 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-httpd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.419341 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-httpd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.419367 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" containerName="glance-log" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.421653 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.422751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.458241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4llj\" (UniqueName: \"kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj\") pod \"barbican-worker-6586fcc5bc-m65hd\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.480540 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5655889d58-4w5px"] Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493259 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84b2\" (UniqueName: \"kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493366 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.493631 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.503794 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5655889d58-4w5px" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.504045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data" (OuterVolumeSpecName: "config-data") pod "09483d11-68f6-41a5-928d-dacd558d1d0d" (UID: "09483d11-68f6-41a5-928d-dacd558d1d0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.509067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.560440 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.566707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"09483d11-68f6-41a5-928d-dacd558d1d0d","Type":"ContainerDied","Data":"1d8f432924373d8d1c50d2cbbbb708603d980808d3db1e26fc0b2497d03a13a1"} Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.566758 4764 scope.go:117] "RemoveContainer" containerID="5dfca99798dbfee01df7dff1c8e661dbdd011cae247e8fa49cd3690234374f92" Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.566854 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.574925 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595400 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c84b2\" (UniqueName: \"kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595639 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz252\" (UniqueName: \"kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595693 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.595706 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09483d11-68f6-41a5-928d-dacd558d1d0d-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.596536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.602010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.614343 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5655889d58-4w5px"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.617067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.619888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerStarted","Data":"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41"}
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.619916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerStarted","Data":"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847"}
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.620408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fbdf79ddf-5llj7"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.624097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.624633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.632542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84b2\" (UniqueName: \"kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2\") pod \"dnsmasq-dns-85ff748b95-n5xlv\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.643534 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-9846b946b-trg4v"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.644981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.677678 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6586fcc5bc-m65hd"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.699864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.699910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz252\" (UniqueName: \"kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700230 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5dda5ed-7a47-43f8-833f-a715aafe6c24-logs\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data-custom\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700307 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-combined-ca-bundle\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pswhv\" (UniqueName: \"kubernetes.io/projected/a5dda5ed-7a47-43f8-833f-a715aafe6c24-kube-api-access-pswhv\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.700409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.706464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-9846b946b-trg4v"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.716829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.722497 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-58fcdd88fc-hzbh5"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.725892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.727833 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.731759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.731918 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.736480 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58fcdd88fc-hzbh5"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.752059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz252\" (UniqueName: \"kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252\") pod \"barbican-api-5655889d58-4w5px\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") " pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.775137 4764 scope.go:117] "RemoveContainer" containerID="faab6afefbd81ed2e90a84036872532db8b8b95fa6eeac3299ef37725d4d0b43"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.788594 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fbdf79ddf-5llj7" podStartSLOduration=4.788578485 podStartE2EDuration="4.788578485s" podCreationTimestamp="2026-03-20 15:11:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:56.6577535 +0000 UTC m=+1238.273942629" watchObservedRunningTime="2026-03-20 15:11:56.788578485 +0000 UTC m=+1238.404767614"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5dda5ed-7a47-43f8-833f-a715aafe6c24-logs\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data-custom\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-combined-ca-bundle\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pswhv\" (UniqueName: \"kubernetes.io/projected/a5dda5ed-7a47-43f8-833f-a715aafe6c24-kube-api-access-pswhv\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn6x\" (UniqueName: \"kubernetes.io/projected/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-kube-api-access-lqn6x\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802457 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-combined-ca-bundle\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data-custom\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-logs\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.802759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5dda5ed-7a47-43f8-833f-a715aafe6c24-logs\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.824957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-combined-ca-bundle\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.829146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data-custom\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.831699 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5dda5ed-7a47-43f8-833f-a715aafe6c24-config-data\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.840118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pswhv\" (UniqueName: \"kubernetes.io/projected/a5dda5ed-7a47-43f8-833f-a715aafe6c24-kube-api-access-pswhv\") pod \"barbican-keystone-listener-9846b946b-trg4v\" (UID: \"a5dda5ed-7a47-43f8-833f-a715aafe6c24\") " pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.871879 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.892451 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.894027 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.903880 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.904444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqn6x\" (UniqueName: \"kubernetes.io/projected/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-kube-api-access-lqn6x\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.904536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-combined-ca-bundle\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.904562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data-custom\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.904597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-logs\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.904614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.914703 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-logs\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.917191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.918059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-combined-ca-bundle\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.919573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-config-data-custom\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.919660 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.935468 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.946802 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.958657 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.963749 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.965289 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.980834 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.980919 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.980953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn6x\" (UniqueName: \"kubernetes.io/projected/b3fc5a72-a60e-4eef-be6a-aa4387e95c45-kube-api-access-lqn6x\") pod \"barbican-worker-58fcdd88fc-hzbh5\" (UID: \"b3fc5a72-a60e-4eef-be6a-aa4387e95c45\") " pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.981639 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-9846b946b-trg4v"
Mar 20 15:11:56 crc kubenswrapper[4764]: I0320 15:11:56.989856 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.009799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58fcdd88fc-hzbh5"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.010909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.010952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.011428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4qx\" (UniqueName: \"kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.011496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.011540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.114358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.114963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcszk\" (UniqueName: \"kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.115107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4qx\" (UniqueName: \"kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116705 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.116849 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.117424 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.123098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.124490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.135031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.162677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4qx\" (UniqueName: \"kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx\") pod \"barbican-api-86db6c8bf4-mghmx\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " pod="openstack/barbican-api-86db6c8bf4-mghmx"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.194450 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09483d11-68f6-41a5-928d-dacd558d1d0d" path="/var/lib/kubelet/pods/09483d11-68f6-41a5-928d-dacd558d1d0d/volumes"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.195415 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85fe2438-0a1e-4516-9da9-69b51291b035" path="/var/lib/kubelet/pods/85fe2438-0a1e-4516-9da9-69b51291b035/volumes"
Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod
\"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220749 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220790 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.220982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcszk\" (UniqueName: \"kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.221010 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.222871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.223242 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.223969 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 
15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.232582 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.234937 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.234942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.236065 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.236713 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"] Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.245786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.250314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mcszk\" (UniqueName: \"kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.326039 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") " pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.341090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.368064 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.457086 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c69f8c7f-rlffj"] Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.582126 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"] Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.648212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerStarted","Data":"e06f1e262f5d329a30b57b703e20b6b4d79addde2c20c4d3b6114e36301e112c"} Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.650394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerStarted","Data":"aa651dd5976b6201e5c33968256d526e6a79ab2b8dbaa07a6cc43993a4411ffd"} Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 
15:11:57.699339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c69f8c7f-rlffj" event={"ID":"8d9fa689-47c1-464b-82a8-9f047a084ed7","Type":"ContainerStarted","Data":"d1c4684475415c1e37a429ad6748dc55b0a5550a6112d03a1ad318eb48f5a693"} Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.730818 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerStarted","Data":"4a8ae80f73d8852b82e8d397426ecb149e483377595a75475ebf053325df20a7"} Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.742676 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="dnsmasq-dns" containerID="cri-o://97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b" gracePeriod=10 Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.742771 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerStarted","Data":"0642b85b1a3de9d092fcfbfb02f0c0a1045297773c800c161ae5a49cbeb09525"} Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.821527 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"] Mar 20 15:11:57 crc kubenswrapper[4764]: E0320 15:11:57.954010 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51124ecf_d50f_478d_9a87_f81f7e72571e.slice/crio-97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b.scope\": RecentStats: unable to find data in memory cache]" Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.956649 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-5655889d58-4w5px"] Mar 20 15:11:57 crc kubenswrapper[4764]: I0320 15:11:57.987952 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-9846b946b-trg4v"] Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.152793 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58fcdd88fc-hzbh5"] Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.280446 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"] Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.367868 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:11:58 crc kubenswrapper[4764]: W0320 15:11:58.392259 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d39b3c2_aff3_4c9f_8e01_737c7ad92d68.slice/crio-82b07b19fa509096697ac2c0f8c8ab55c5820307bec6afaa4469e49f87f293a9 WatchSource:0}: Error finding container 82b07b19fa509096697ac2c0f8c8ab55c5820307bec6afaa4469e49f87f293a9: Status 404 returned error can't find the container with id 82b07b19fa509096697ac2c0f8c8ab55c5820307bec6afaa4469e49f87f293a9 Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.816918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerStarted","Data":"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.829514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9846b946b-trg4v" event={"ID":"a5dda5ed-7a47-43f8-833f-a715aafe6c24","Type":"ContainerStarted","Data":"b21a40f1334c92b9b8c57a617a2c02578d4f7fcad789df03ffc3d4efcb3bf913"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.844695 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerID="97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b" exitCode=0 Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.844754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" event={"ID":"51124ecf-d50f-478d-9a87-f81f7e72571e","Type":"ContainerDied","Data":"97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.856872 4764 generic.go:334] "Generic (PLEG): container finished" podID="df6da429-0f30-4521-b52e-304ca4830075" containerID="d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a" exitCode=0 Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.856923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" event={"ID":"df6da429-0f30-4521-b52e-304ca4830075","Type":"ContainerDied","Data":"d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.856947 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" event={"ID":"df6da429-0f30-4521-b52e-304ca4830075","Type":"ContainerStarted","Data":"e2fddaa6816ed99043d5ab82702eb3e26bb70bae66bc978e1d6f532475e23fd4"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.865080 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerStarted","Data":"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.865112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerStarted","Data":"f2a57239d01d03b74f07cf67661d0271196101b8774167d900a1ef879d143477"} 
Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.868095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c69f8c7f-rlffj" event={"ID":"8d9fa689-47c1-464b-82a8-9f047a084ed7","Type":"ContainerStarted","Data":"1d2a13ba224befeaf44e61acc2178f0d05126266096b481ad0ec818590aea43e"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.868604 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.879429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerStarted","Data":"5087e4f79d02afc5c3a3c24d0d6f90f10eea76feb9b6c39c7018ac9c6782749c"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.925544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerStarted","Data":"82b07b19fa509096697ac2c0f8c8ab55c5820307bec6afaa4469e49f87f293a9"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.958288 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7c69f8c7f-rlffj" podStartSLOduration=3.958269817 podStartE2EDuration="3.958269817s" podCreationTimestamp="2026-03-20 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:11:58.925991332 +0000 UTC m=+1240.542180471" watchObservedRunningTime="2026-03-20 15:11:58.958269817 +0000 UTC m=+1240.574458946" Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.961412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58fcdd88fc-hzbh5" 
event={"ID":"b3fc5a72-a60e-4eef-be6a-aa4387e95c45","Type":"ContainerStarted","Data":"633e9bbd971ac9ad86f006580c9069f01c73d21568a4c3c818727031e217920f"} Mar 20 15:11:58 crc kubenswrapper[4764]: I0320 15:11:58.991138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerStarted","Data":"c0a5f6cb0134ea2c57541759e37b1f4118d99daab52ac20c28ed4d50a48e572e"} Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.186060 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.306775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0\") pod \"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.306862 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc\") pod \"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.307556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngmv\" (UniqueName: \"kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv\") pod \"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.307586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb\") pod 
\"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.307799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb\") pod \"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.307827 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config\") pod \"51124ecf-d50f-478d-9a87-f81f7e72571e\" (UID: \"51124ecf-d50f-478d-9a87-f81f7e72571e\") " Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.347645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv" (OuterVolumeSpecName: "kube-api-access-hngmv") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "kube-api-access-hngmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.406599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.416448 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.416680 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngmv\" (UniqueName: \"kubernetes.io/projected/51124ecf-d50f-478d-9a87-f81f7e72571e-kube-api-access-hngmv\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.483687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.518994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.519551 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.519582 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.531396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.531861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config" (OuterVolumeSpecName: "config") pod "51124ecf-d50f-478d-9a87-f81f7e72571e" (UID: "51124ecf-d50f-478d-9a87-f81f7e72571e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.622458 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:11:59 crc kubenswrapper[4764]: I0320 15:11:59.622509 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51124ecf-d50f-478d-9a87-f81f7e72571e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.032428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" event={"ID":"51124ecf-d50f-478d-9a87-f81f7e72571e","Type":"ContainerDied","Data":"6b7515d1aad8ed273484631b9ab4ece7f4f480327fc336977b743630c7c9a158"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.032747 4764 scope.go:117] "RemoveContainer" containerID="97824812d78276c9a797e70ec83ab2d4bcf450904c75ded8cfdd35ce2642d66b" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.032460 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-k7b98" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.056900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerStarted","Data":"1d19694c192b43ec439c344a64633b37c479312729679e189d629323589abd8a"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.058354 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.058419 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.074485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" event={"ID":"df6da429-0f30-4521-b52e-304ca4830075","Type":"ContainerStarted","Data":"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.075147 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.082372 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.086210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerStarted","Data":"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.086273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" 
event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerStarted","Data":"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.086763 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.086910 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.103859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerStarted","Data":"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.104212 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5655889d58-4w5px" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.104326 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5655889d58-4w5px" Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.106706 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-k7b98"] Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.116056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerStarted","Data":"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de"} Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.118797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerStarted","Data":"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"} Mar 20 15:12:00 crc 
kubenswrapper[4764]: I0320 15:12:00.128142 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6bdffd5796-4724r" podStartSLOduration=5.128124925 podStartE2EDuration="5.128124925s" podCreationTimestamp="2026-03-20 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:00.103934036 +0000 UTC m=+1241.720123165" watchObservedRunningTime="2026-03-20 15:12:00.128124925 +0000 UTC m=+1241.744314054"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.128234 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5655889d58-4w5px"]
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.163818 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" podStartSLOduration=4.163799145 podStartE2EDuration="4.163799145s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:00.130122647 +0000 UTC m=+1241.746311776" watchObservedRunningTime="2026-03-20 15:12:00.163799145 +0000 UTC m=+1241.779988274"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.179536 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58988f9f54-f58q6"]
Mar 20 15:12:00 crc kubenswrapper[4764]: E0320 15:12:00.186233 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="init"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.186268 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="init"
Mar 20 15:12:00 crc kubenswrapper[4764]: E0320 15:12:00.186324 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="dnsmasq-dns"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.186331 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="dnsmasq-dns"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.187081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" containerName="dnsmasq-dns"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.188587 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.190363 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.213877 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566992-6qwzt"]
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.215595 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566992-6qwzt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.225807 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.227035 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.230573 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.230717 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.233005 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-public-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.235920 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data-custom\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.236246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-combined-ca-bundle\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.236481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-internal-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.236636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjcp\" (UniqueName: \"kubernetes.io/projected/8ed91bcd-a582-4c3c-893d-a1f081c657ee-kube-api-access-9pjcp\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.236864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.236891 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed91bcd-a582-4c3c-893d-a1f081c657ee-logs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.262693 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86db6c8bf4-mghmx" podStartSLOduration=4.262671704 podStartE2EDuration="4.262671704s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:00.154749718 +0000 UTC m=+1241.770938847" watchObservedRunningTime="2026-03-20 15:12:00.262671704 +0000 UTC m=+1241.878860833"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.308230 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58988f9f54-f58q6"]
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.311957 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5655889d58-4w5px" podStartSLOduration=4.311922259 podStartE2EDuration="4.311922259s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:00.173703337 +0000 UTC m=+1241.789892456" watchObservedRunningTime="2026-03-20 15:12:00.311922259 +0000 UTC m=+1241.928111378"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.334234 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566992-6qwzt"]
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.338669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-internal-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.338880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjcp\" (UniqueName: \"kubernetes.io/projected/8ed91bcd-a582-4c3c-893d-a1f081c657ee-kube-api-access-9pjcp\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.338929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.338981 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed91bcd-a582-4c3c-893d-a1f081c657ee-logs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.339081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9sc\" (UniqueName: \"kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc\") pod \"auto-csr-approver-29566992-6qwzt\" (UID: \"a667b00c-fedc-470d-adb4-309d0a96a676\") " pod="openshift-infra/auto-csr-approver-29566992-6qwzt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.339133 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-public-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.339161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data-custom\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.339236 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-combined-ca-bundle\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.340048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ed91bcd-a582-4c3c-893d-a1f081c657ee-logs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.346702 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-public-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.352644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.378979 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.3789617960000005 podStartE2EDuration="5.378961796s" podCreationTimestamp="2026-03-20 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:00.256250617 +0000 UTC m=+1241.872439746" watchObservedRunningTime="2026-03-20 15:12:00.378961796 +0000 UTC m=+1241.995150925"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.390627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjcp\" (UniqueName: \"kubernetes.io/projected/8ed91bcd-a582-4c3c-893d-a1f081c657ee-kube-api-access-9pjcp\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.390834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-combined-ca-bundle\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.390993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-config-data-custom\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.391056 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed91bcd-a582-4c3c-893d-a1f081c657ee-internal-tls-certs\") pod \"barbican-api-58988f9f54-f58q6\" (UID: \"8ed91bcd-a582-4c3c-893d-a1f081c657ee\") " pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.442721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9sc\" (UniqueName: \"kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc\") pod \"auto-csr-approver-29566992-6qwzt\" (UID: \"a667b00c-fedc-470d-adb4-309d0a96a676\") " pod="openshift-infra/auto-csr-approver-29566992-6qwzt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.462095 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9sc\" (UniqueName: \"kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc\") pod \"auto-csr-approver-29566992-6qwzt\" (UID: \"a667b00c-fedc-470d-adb4-309d0a96a676\") " pod="openshift-infra/auto-csr-approver-29566992-6qwzt"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.521206 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:00 crc kubenswrapper[4764]: I0320 15:12:00.543052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566992-6qwzt"
Mar 20 15:12:01 crc kubenswrapper[4764]: I0320 15:12:01.075943 4764 scope.go:117] "RemoveContainer" containerID="4dbfdccd835db6fc255f853553a30417cb5aa92d0d331b3563aafe0f6022a66f"
Mar 20 15:12:01 crc kubenswrapper[4764]: I0320 15:12:01.142464 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51124ecf-d50f-478d-9a87-f81f7e72571e" path="/var/lib/kubelet/pods/51124ecf-d50f-478d-9a87-f81f7e72571e/volumes"
Mar 20 15:12:01 crc kubenswrapper[4764]: I0320 15:12:01.867788 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58988f9f54-f58q6"]
Mar 20 15:12:01 crc kubenswrapper[4764]: W0320 15:12:01.892279 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ed91bcd_a582_4c3c_893d_a1f081c657ee.slice/crio-82080ac11a4e186962fd8662e1d0d947aa87d4a353b44d9a7713de9087c56a68 WatchSource:0}: Error finding container 82080ac11a4e186962fd8662e1d0d947aa87d4a353b44d9a7713de9087c56a68: Status 404 returned error can't find the container with id 82080ac11a4e186962fd8662e1d0d947aa87d4a353b44d9a7713de9087c56a68
Mar 20 15:12:01 crc kubenswrapper[4764]: I0320 15:12:01.954317 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566992-6qwzt"]
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.206831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" event={"ID":"a667b00c-fedc-470d-adb4-309d0a96a676","Type":"ContainerStarted","Data":"b150274151ee26a91ce2aa1126d061a5f947fdb476d5b0402c79bf7e2d7ec1ed"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.214143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerStarted","Data":"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.214176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerStarted","Data":"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.216328 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58988f9f54-f58q6" event={"ID":"8ed91bcd-a582-4c3c-893d-a1f081c657ee","Type":"ContainerStarted","Data":"82080ac11a4e186962fd8662e1d0d947aa87d4a353b44d9a7713de9087c56a68"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.220303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerStarted","Data":"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.229127 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5655889d58-4w5px" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api-log" containerID="cri-o://b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a" gracePeriod=30
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.230185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9846b946b-trg4v" event={"ID":"a5dda5ed-7a47-43f8-833f-a715aafe6c24","Type":"ContainerStarted","Data":"a506a41a68a8ed51cfed535e8700257df5f49a0f5dbd807a44acecb6c7a65a81"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.230208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-9846b946b-trg4v" event={"ID":"a5dda5ed-7a47-43f8-833f-a715aafe6c24","Type":"ContainerStarted","Data":"54738808568bcc269fc3a9ee60726bdf3d4331c1afed5614a87f4eedc51a1511"}
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.230255 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5655889d58-4w5px" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api" containerID="cri-o://8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b" gracePeriod=30
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.232459 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6586fcc5bc-m65hd" podStartSLOduration=3.643285768 podStartE2EDuration="7.232439092s" podCreationTimestamp="2026-03-20 15:11:55 +0000 UTC" firstStartedPulling="2026-03-20 15:11:57.59132406 +0000 UTC m=+1239.207513189" lastFinishedPulling="2026-03-20 15:12:01.180477384 +0000 UTC m=+1242.796666513" observedRunningTime="2026-03-20 15:12:02.229301186 +0000 UTC m=+1243.845490315" watchObservedRunningTime="2026-03-20 15:12:02.232439092 +0000 UTC m=+1243.848628221"
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.265459 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-9846b946b-trg4v" podStartSLOduration=2.905798345 podStartE2EDuration="6.265441819s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="2026-03-20 15:11:58.039339593 +0000 UTC m=+1239.655528722" lastFinishedPulling="2026-03-20 15:12:01.398983057 +0000 UTC m=+1243.015172196" observedRunningTime="2026-03-20 15:12:02.263667134 +0000 UTC m=+1243.879856263" watchObservedRunningTime="2026-03-20 15:12:02.265441819 +0000 UTC m=+1243.881630938"
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.307887 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" podStartSLOduration=3.378656485 podStartE2EDuration="7.307865285s" podCreationTimestamp="2026-03-20 15:11:55 +0000 UTC" firstStartedPulling="2026-03-20 15:11:57.320786668 +0000 UTC m=+1238.936975797" lastFinishedPulling="2026-03-20 15:12:01.249995468 +0000 UTC m=+1242.866184597" observedRunningTime="2026-03-20 15:12:02.283608624 +0000 UTC m=+1243.899797753" watchObservedRunningTime="2026-03-20 15:12:02.307865285 +0000 UTC m=+1243.924054414"
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.371840 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"]
Mar 20 15:12:02 crc kubenswrapper[4764]: I0320 15:12:02.919128 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.038136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz252\" (UniqueName: \"kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252\") pod \"6552d6d1-0522-4c30-9eba-b90d59482ea0\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") "
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.038554 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data\") pod \"6552d6d1-0522-4c30-9eba-b90d59482ea0\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") "
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.038574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle\") pod \"6552d6d1-0522-4c30-9eba-b90d59482ea0\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") "
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.038601 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom\") pod \"6552d6d1-0522-4c30-9eba-b90d59482ea0\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") "
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.038725 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs\") pod \"6552d6d1-0522-4c30-9eba-b90d59482ea0\" (UID: \"6552d6d1-0522-4c30-9eba-b90d59482ea0\") "
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.047921 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252" (OuterVolumeSpecName: "kube-api-access-tz252") pod "6552d6d1-0522-4c30-9eba-b90d59482ea0" (UID: "6552d6d1-0522-4c30-9eba-b90d59482ea0"). InnerVolumeSpecName "kube-api-access-tz252". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.058997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs" (OuterVolumeSpecName: "logs") pod "6552d6d1-0522-4c30-9eba-b90d59482ea0" (UID: "6552d6d1-0522-4c30-9eba-b90d59482ea0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.074518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6552d6d1-0522-4c30-9eba-b90d59482ea0" (UID: "6552d6d1-0522-4c30-9eba-b90d59482ea0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.127569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6552d6d1-0522-4c30-9eba-b90d59482ea0" (UID: "6552d6d1-0522-4c30-9eba-b90d59482ea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.140514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz252\" (UniqueName: \"kubernetes.io/projected/6552d6d1-0522-4c30-9eba-b90d59482ea0-kube-api-access-tz252\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.140569 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.140579 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.140588 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6552d6d1-0522-4c30-9eba-b90d59482ea0-logs\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.186654 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data" (OuterVolumeSpecName: "config-data") pod "6552d6d1-0522-4c30-9eba-b90d59482ea0" (UID: "6552d6d1-0522-4c30-9eba-b90d59482ea0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.242180 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6552d6d1-0522-4c30-9eba-b90d59482ea0-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.246751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58fcdd88fc-hzbh5" event={"ID":"b3fc5a72-a60e-4eef-be6a-aa4387e95c45","Type":"ContainerStarted","Data":"8f42f4b376ee1de3f1d87160b85b29dc41c7342780fbf5ad99a5caf5823aded3"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.246794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58fcdd88fc-hzbh5" event={"ID":"b3fc5a72-a60e-4eef-be6a-aa4387e95c45","Type":"ContainerStarted","Data":"3bccdb147d9c85e62f4c10d0a128d0e6769d443b4f6815d183c2ffc6505ae450"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.252172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerStarted","Data":"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.257523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58988f9f54-f58q6" event={"ID":"8ed91bcd-a582-4c3c-893d-a1f081c657ee","Type":"ContainerStarted","Data":"3b515b6408e75775cb6888e4494c539a5f5e85c5e4fd65c82edd46d325aac86c"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.257770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58988f9f54-f58q6" event={"ID":"8ed91bcd-a582-4c3c-893d-a1f081c657ee","Type":"ContainerStarted","Data":"b323ed0850ec7c3a50711549a2d2dd9f242a960876ff302ad8fd707413555e73"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.257862 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.258664 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58988f9f54-f58q6"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.261699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6zj6m" event={"ID":"337e2278-00e7-428e-97c1-c8d940d83aa4","Type":"ContainerStarted","Data":"c29b5ce861c58e63c1aa72cf4eda65bae641e857b78531c017dca685c40e85b3"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.262247 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-58fcdd88fc-hzbh5" podStartSLOduration=3.674178032 podStartE2EDuration="7.262232351s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="2026-03-20 15:11:58.192028397 +0000 UTC m=+1239.808217526" lastFinishedPulling="2026-03-20 15:12:01.780082716 +0000 UTC m=+1243.396271845" observedRunningTime="2026-03-20 15:12:03.262081957 +0000 UTC m=+1244.878271086" watchObservedRunningTime="2026-03-20 15:12:03.262232351 +0000 UTC m=+1244.878421480"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270699 4764 generic.go:334] "Generic (PLEG): container finished" podID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerID="8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b" exitCode=0
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270730 4764 generic.go:334] "Generic (PLEG): container finished" podID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerID="b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a" exitCode=143
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerDied","Data":"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerDied","Data":"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5655889d58-4w5px" event={"ID":"6552d6d1-0522-4c30-9eba-b90d59482ea0","Type":"ContainerDied","Data":"f2a57239d01d03b74f07cf67661d0271196101b8774167d900a1ef879d143477"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270851 4764 scope.go:117] "RemoveContainer" containerID="8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.270990 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5655889d58-4w5px"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.282944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerStarted","Data":"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e"}
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.288144 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58988f9f54-f58q6" podStartSLOduration=3.288127943 podStartE2EDuration="3.288127943s" podCreationTimestamp="2026-03-20 15:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:03.283539682 +0000 UTC m=+1244.899728811" watchObservedRunningTime="2026-03-20 15:12:03.288127943 +0000 UTC m=+1244.904317072"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.317220 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"]
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.327106 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.327084942 podStartE2EDuration="7.327084942s" podCreationTimestamp="2026-03-20 15:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:03.312093564 +0000 UTC m=+1244.928282703" watchObservedRunningTime="2026-03-20 15:12:03.327084942 +0000 UTC m=+1244.943274071"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.350078 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6zj6m" podStartSLOduration=4.062588989 podStartE2EDuration="48.350057133s" podCreationTimestamp="2026-03-20 15:11:15 +0000 UTC" firstStartedPulling="2026-03-20 15:11:16.963295046 +0000 UTC m=+1198.579484165" lastFinishedPulling="2026-03-20 15:12:01.25076318 +0000 UTC m=+1242.866952309" observedRunningTime="2026-03-20 15:12:03.335776977 +0000 UTC m=+1244.951966126" watchObservedRunningTime="2026-03-20 15:12:03.350057133 +0000 UTC m=+1244.966246262"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.358097 4764 scope.go:117] "RemoveContainer" containerID="b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.383518 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5655889d58-4w5px"]
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.391769 4764 scope.go:117] "RemoveContainer" containerID="8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"
Mar 20 15:12:03 crc kubenswrapper[4764]: E0320 15:12:03.393030 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b\": container with ID starting with 8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b not found: ID does not exist" containerID="8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.393068 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"} err="failed to get container status \"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b\": rpc error: code = NotFound desc = could not find container \"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b\": container with ID starting with 8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b not found: ID does not exist"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.393091 4764 scope.go:117] "RemoveContainer" containerID="b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"
Mar 20 15:12:03 crc kubenswrapper[4764]: E0320 15:12:03.396992 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a\": container with ID starting with b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a not found: ID does not exist" containerID="b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.397220 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"} err="failed to get container status \"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a\": rpc error: code = NotFound desc = could not find container \"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a\": container with ID starting with b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a not found: ID does not exist"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.397525 4764 scope.go:117] "RemoveContainer" containerID="8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.397664 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5655889d58-4w5px"]
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.403610 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b"} err="failed to get container status \"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b\": rpc error: code = NotFound desc = could not find container \"8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b\": container with ID starting with 8ec7240baa821156c11ecaa89ca8168c6f92abddaf12c496aeab7f695557f49b not found: ID does not exist"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.403663 4764 scope.go:117] "RemoveContainer" containerID="b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"
Mar 20 15:12:03 crc kubenswrapper[4764]: I0320 15:12:03.407300 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a"} err="failed to get container status \"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a\": rpc error: code = NotFound desc = could not find container \"b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a\": container with ID starting with b999ad6bc1aa582e7dfe8f7ac44944e28edb2162f5baa5dd6c4eb3562ef0004a not found: ID does not exist"
Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.292541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" event={"ID":"a667b00c-fedc-470d-adb4-309d0a96a676","Type":"ContainerStarted","Data":"bb41983f78feb0ec34c6f2d9b09214cc19ae10c2f196f086cebb67423e5791b1"}
Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.295135 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6586fcc5bc-m65hd" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker-log" containerID="cri-o://d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61" gracePeriod=30
Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.295174 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6586fcc5bc-m65hd" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker" containerID="cri-o://bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce" gracePeriod=30
Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.296276 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener-log" containerID="cri-o://d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04" gracePeriod=30 Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.296397 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener" containerID="cri-o://ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e" gracePeriod=30 Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.310761 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" podStartSLOduration=2.998557508 podStartE2EDuration="4.310746513s" podCreationTimestamp="2026-03-20 15:12:00 +0000 UTC" firstStartedPulling="2026-03-20 15:12:01.968789859 +0000 UTC m=+1243.584978988" lastFinishedPulling="2026-03-20 15:12:03.280978864 +0000 UTC m=+1244.897167993" observedRunningTime="2026-03-20 15:12:04.30639508 +0000 UTC m=+1245.922584209" watchObservedRunningTime="2026-03-20 15:12:04.310746513 +0000 UTC m=+1245.926935642" Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.754569 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66899c9d8-zh5gp" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 15:12:04 crc kubenswrapper[4764]: I0320 15:12:04.789355 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-655785589d-5cnb4" podUID="cb148fab-0227-4725-af4e-d6dba5740303" containerName="horizon" probeResult="failure" 
output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.143396 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" path="/var/lib/kubelet/pods/6552d6d1-0522-4c30-9eba-b90d59482ea0/volumes" Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.304983 4764 generic.go:334] "Generic (PLEG): container finished" podID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerID="d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61" exitCode=143 Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.305048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerDied","Data":"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61"} Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.307423 4764 generic.go:334] "Generic (PLEG): container finished" podID="a667b00c-fedc-470d-adb4-309d0a96a676" containerID="bb41983f78feb0ec34c6f2d9b09214cc19ae10c2f196f086cebb67423e5791b1" exitCode=0 Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.307505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" event={"ID":"a667b00c-fedc-470d-adb4-309d0a96a676","Type":"ContainerDied","Data":"bb41983f78feb0ec34c6f2d9b09214cc19ae10c2f196f086cebb67423e5791b1"} Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.309610 4764 generic.go:334] "Generic (PLEG): container finished" podID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerID="d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04" exitCode=143 Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.310505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" 
event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerDied","Data":"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04"} Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.716058 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.717589 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.759312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:12:05 crc kubenswrapper[4764]: I0320 15:12:05.770729 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:12:06 crc kubenswrapper[4764]: I0320 15:12:06.319027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:12:06 crc kubenswrapper[4764]: I0320 15:12:06.319099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:12:06 crc kubenswrapper[4764]: I0320 15:12:06.874525 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:12:06 crc kubenswrapper[4764]: I0320 15:12:06.933596 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:12:06 crc kubenswrapper[4764]: I0320 15:12:06.933843 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" containerID="cri-o://02e7741578177972eef3b20819a82b9e605a6acb90680547cb62400d919c277d" gracePeriod=10 Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 
15:12:07.086460 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.332712 4764 generic.go:334] "Generic (PLEG): container finished" podID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerID="02e7741578177972eef3b20819a82b9e605a6acb90680547cb62400d919c277d" exitCode=0 Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.333552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" event={"ID":"be95591c-8398-4ba2-aa65-784bc64cc1b3","Type":"ContainerDied","Data":"02e7741578177972eef3b20819a82b9e605a6acb90680547cb62400d919c277d"} Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.369256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.369296 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.413454 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:07 crc kubenswrapper[4764]: I0320 15:12:07.429109 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.342031 4764 generic.go:334] "Generic (PLEG): container finished" podID="337e2278-00e7-428e-97c1-c8d940d83aa4" containerID="c29b5ce861c58e63c1aa72cf4eda65bae641e857b78531c017dca685c40e85b3" exitCode=0 Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.342070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6zj6m" 
event={"ID":"337e2278-00e7-428e-97c1-c8d940d83aa4","Type":"ContainerDied","Data":"c29b5ce861c58e63c1aa72cf4eda65bae641e857b78531c017dca685c40e85b3"} Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.342573 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.342602 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.669161 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.669263 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.819433 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:08 crc kubenswrapper[4764]: I0320 15:12:08.947737 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:09 crc kubenswrapper[4764]: I0320 15:12:09.741207 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:12:10 crc kubenswrapper[4764]: I0320 15:12:10.653494 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:10 crc kubenswrapper[4764]: I0320 15:12:10.653976 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:12:10 crc kubenswrapper[4764]: I0320 15:12:10.977854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.077583 4764 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.491790 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58988f9f54-f58q6" Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.901287 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58988f9f54-f58q6" Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.969718 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6zj6m" Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.971449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"] Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.971727 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86db6c8bf4-mghmx" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api-log" containerID="cri-o://93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4" gracePeriod=30 Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.971942 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86db6c8bf4-mghmx" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" containerID="cri-o://c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643" gracePeriod=30 Mar 20 15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.981192 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-86db6c8bf4-mghmx" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": EOF" Mar 20 
15:12:12 crc kubenswrapper[4764]: I0320 15:12:12.981630 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.071497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr9sc\" (UniqueName: \"kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc\") pod \"a667b00c-fedc-470d-adb4-309d0a96a676\" (UID: \"a667b00c-fedc-470d-adb4-309d0a96a676\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.072585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.073177 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.073322 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.073502 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: 
I0320 15:12:13.073687 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5rpc\" (UniqueName: \"kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.073802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data\") pod \"337e2278-00e7-428e-97c1-c8d940d83aa4\" (UID: \"337e2278-00e7-428e-97c1-c8d940d83aa4\") " Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.074874 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.078234 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc" (OuterVolumeSpecName: "kube-api-access-dr9sc") pod "a667b00c-fedc-470d-adb4-309d0a96a676" (UID: "a667b00c-fedc-470d-adb4-309d0a96a676"). InnerVolumeSpecName "kube-api-access-dr9sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.084532 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc" (OuterVolumeSpecName: "kube-api-access-f5rpc") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "kube-api-access-f5rpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.108209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.117673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts" (OuterVolumeSpecName: "scripts") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.132089 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.167099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data" (OuterVolumeSpecName: "config-data") pod "337e2278-00e7-428e-97c1-c8d940d83aa4" (UID: "337e2278-00e7-428e-97c1-c8d940d83aa4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177019 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177046 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr9sc\" (UniqueName: \"kubernetes.io/projected/a667b00c-fedc-470d-adb4-309d0a96a676-kube-api-access-dr9sc\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177057 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177066 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177075 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/337e2278-00e7-428e-97c1-c8d940d83aa4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177083 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/337e2278-00e7-428e-97c1-c8d940d83aa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.177091 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5rpc\" (UniqueName: \"kubernetes.io/projected/337e2278-00e7-428e-97c1-c8d940d83aa4-kube-api-access-f5rpc\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 
15:12:13.385030 4764 generic.go:334] "Generic (PLEG): container finished" podID="082cc398-d04e-430c-8de4-b1bc757cd290" containerID="93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4" exitCode=143 Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.385091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerDied","Data":"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4"} Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.387358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" event={"ID":"a667b00c-fedc-470d-adb4-309d0a96a676","Type":"ContainerDied","Data":"b150274151ee26a91ce2aa1126d061a5f947fdb476d5b0402c79bf7e2d7ec1ed"} Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.387392 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b150274151ee26a91ce2aa1126d061a5f947fdb476d5b0402c79bf7e2d7ec1ed" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.387453 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566992-6qwzt" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.389645 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6zj6m" Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.389976 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6zj6m" event={"ID":"337e2278-00e7-428e-97c1-c8d940d83aa4","Type":"ContainerDied","Data":"841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43"} Mar 20 15:12:13 crc kubenswrapper[4764]: I0320 15:12:13.389994 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841981b37f93195b61bf534bbb9e30dd532f12aed21c616624b0545667018c43" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.090437 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566986-jgn2s"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.124248 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566986-jgn2s"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.137716 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.200757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvptj\" (UniqueName: \"kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.201014 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.201109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.201261 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.201391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.201482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb\") pod \"be95591c-8398-4ba2-aa65-784bc64cc1b3\" (UID: \"be95591c-8398-4ba2-aa65-784bc64cc1b3\") " Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.207475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj" (OuterVolumeSpecName: "kube-api-access-lvptj") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "kube-api-access-lvptj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.282153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.293897 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.293963 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"] Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294309 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" containerName="cinder-db-sync" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294321 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" containerName="cinder-db-sync" Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294337 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a667b00c-fedc-470d-adb4-309d0a96a676" containerName="oc" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294343 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a667b00c-fedc-470d-adb4-309d0a96a676" containerName="oc" Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294355 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294360 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294397 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api-log" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294405 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api-log" Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294421 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api" Mar 20 15:12:14 crc 
kubenswrapper[4764]: I0320 15:12:14.294428 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api" Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.294437 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="init" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294442 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="init" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294597 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" containerName="cinder-db-sync" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294612 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" containerName="dnsmasq-dns" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294622 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294631 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6552d6d1-0522-4c30-9eba-b90d59482ea0" containerName="barbican-api-log" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.294642 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a667b00c-fedc-470d-adb4-309d0a96a676" containerName="oc" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.305956 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.305985 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.305994 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvptj\" (UniqueName: \"kubernetes.io/projected/be95591c-8398-4ba2-aa65-784bc64cc1b3-kube-api-access-lvptj\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.309611 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.318729 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.320171 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.322883 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.323054 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.323222 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hq7b8" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.323463 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.331026 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.349578 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.388761 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config" (OuterVolumeSpecName: "config") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.405985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" event={"ID":"be95591c-8398-4ba2-aa65-784bc64cc1b3","Type":"ContainerDied","Data":"564fb25c13dab5e6a6f8d2f44e033c39735f8d5b662320f47b7af2de9f9b402d"} Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.406961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407101 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgchs\" (UniqueName: \"kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjrp\" (UniqueName: \"kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407531 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.407596 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-n6p2n" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.408041 4764 scope.go:117] "RemoveContainer" containerID="02e7741578177972eef3b20819a82b9e605a6acb90680547cb62400d919c277d" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.453991 4764 scope.go:117] "RemoveContainer" containerID="c844da4347390dad93cd84b1871b34563fdf8d02bc9da84dd749258aa59db671" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.479495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.498785 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "be95591c-8398-4ba2-aa65-784bc64cc1b3" (UID: "be95591c-8398-4ba2-aa65-784bc64cc1b3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514289 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514344 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514457 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgchs\" (UniqueName: \"kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514576 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjrp\" (UniqueName: \"kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514681 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.514692 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be95591c-8398-4ba2-aa65-784bc64cc1b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.515713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.517217 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.517867 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.518624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.519023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.519696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.527227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.531024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom\") 
pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.531460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.533898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.561904 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.563529 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.572176 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.581295 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616640 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s2fd\" (UniqueName: \"kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.616842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.707629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjrp\" (UniqueName: \"kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp\") pod \"cinder-scheduler-0\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.708087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgchs\" (UniqueName: \"kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs\") pod \"dnsmasq-dns-5c9776ccc5-b6dbs\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") " pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.718961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s2fd\" (UniqueName: 
\"kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719208 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719259 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.719365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.720852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.723607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.723875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.724273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" 
Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.726862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.741267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s2fd\" (UniqueName: \"kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd\") pod \"cinder-api-0\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.769166 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.783275 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-n6p2n"] Mar 20 15:12:14 crc kubenswrapper[4764]: E0320 15:12:14.787016 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.906109 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.941244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:14 crc kubenswrapper[4764]: I0320 15:12:14.964641 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.153167 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bc41bc-cd24-486e-bca8-1bb7a329b304" path="/var/lib/kubelet/pods/73bc41bc-cd24-486e-bca8-1bb7a329b304/volumes" Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.159777 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be95591c-8398-4ba2-aa65-784bc64cc1b3" path="/var/lib/kubelet/pods/be95591c-8398-4ba2-aa65-784bc64cc1b3/volumes" Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.405059 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:15 crc kubenswrapper[4764]: W0320 15:12:15.407514 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14683a5c_cf9b_4309_a835_2dd8643c7be8.slice/crio-2854449a7ac50c9df5cd169c10d176f8da1bafc87a2df5d02872c471d39223d6 WatchSource:0}: Error finding container 2854449a7ac50c9df5cd169c10d176f8da1bafc87a2df5d02872c471d39223d6: Status 404 returned error can't find the container with id 2854449a7ac50c9df5cd169c10d176f8da1bafc87a2df5d02872c471d39223d6 Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.461741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerStarted","Data":"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc"} Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.461907 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="ceilometer-notification-agent" containerID="cri-o://6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" gracePeriod=30 Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.461995 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.462332 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="proxy-httpd" containerID="cri-o://a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" gracePeriod=30 Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.462535 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="sg-core" containerID="cri-o://fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" gracePeriod=30 Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.586224 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"] Mar 20 15:12:15 crc kubenswrapper[4764]: W0320 15:12:15.597146 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83217f94_75f8_4f9b_b9d1_4247c602cc26.slice/crio-c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee WatchSource:0}: Error finding container c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee: Status 404 returned error can't find the container with id c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee Mar 20 15:12:15 crc kubenswrapper[4764]: I0320 15:12:15.740022 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:15 crc kubenswrapper[4764]: W0320 15:12:15.749143 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cce9cce_4528_46d0_bd82_1361e9c78d45.slice/crio-ddb2e1624aca48fff050a645a9f8eb35987b96033737ac605c29612ecdf643b6 WatchSource:0}: Error finding container 
ddb2e1624aca48fff050a645a9f8eb35987b96033737ac605c29612ecdf643b6: Status 404 returned error can't find the container with id ddb2e1624aca48fff050a645a9f8eb35987b96033737ac605c29612ecdf643b6 Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.249521 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.343153 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.462937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463395 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tghv4\" (UniqueName: \"kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463486 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463856 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.463910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle\") pod \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\" (UID: \"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2\") " Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.464213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.464254 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.464467 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.464483 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.467699 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4" (OuterVolumeSpecName: "kube-api-access-tghv4") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "kube-api-access-tghv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.478505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts" (OuterVolumeSpecName: "scripts") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.480203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerStarted","Data":"ddb2e1624aca48fff050a645a9f8eb35987b96033737ac605c29612ecdf643b6"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.490785 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.494814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerStarted","Data":"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.494862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerStarted","Data":"2854449a7ac50c9df5cd169c10d176f8da1bafc87a2df5d02872c471d39223d6"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.505348 4764 generic.go:334] "Generic (PLEG): container finished" podID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerID="d8e2b25ff1a56bf327c5296603ec2a96bc37d0185283e691c8da4656c7fa90f0" exitCode=0 Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.506039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" event={"ID":"83217f94-75f8-4f9b-b9d1-4247c602cc26","Type":"ContainerDied","Data":"d8e2b25ff1a56bf327c5296603ec2a96bc37d0185283e691c8da4656c7fa90f0"} Mar 20 15:12:16 crc 
kubenswrapper[4764]: I0320 15:12:16.506896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" event={"ID":"83217f94-75f8-4f9b-b9d1-4247c602cc26","Type":"ContainerStarted","Data":"c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512540 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" exitCode=0 Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512566 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" exitCode=2 Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512573 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" exitCode=0 Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerDied","Data":"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerDied","Data":"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerDied","Data":"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 
15:12:16.512643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2","Type":"ContainerDied","Data":"dd534bf1da85441e2afcdbebd3bee5022c399a2f3537f61e41f39888b21f7ac2"} Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512659 4764 scope.go:117] "RemoveContainer" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.512787 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.524004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.562681 4764 scope.go:117] "RemoveContainer" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.566964 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.566994 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tghv4\" (UniqueName: \"kubernetes.io/projected/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-kube-api-access-tghv4\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.567008 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-scripts\") 
on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.567021 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.568633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data" (OuterVolumeSpecName: "config-data") pod "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" (UID: "1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.599586 4764 scope.go:117] "RemoveContainer" containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.640451 4764 scope.go:117] "RemoveContainer" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.641255 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": container with ID starting with a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc not found: ID does not exist" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641286 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc"} err="failed to get container status \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": rpc error: code = NotFound desc = could not find container 
\"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": container with ID starting with a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641306 4764 scope.go:117] "RemoveContainer" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.641608 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": container with ID starting with fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5 not found: ID does not exist" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641635 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5"} err="failed to get container status \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": rpc error: code = NotFound desc = could not find container \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": container with ID starting with fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641653 4764 scope.go:117] "RemoveContainer" containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.641828 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": container with ID starting with 6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897 not found: ID does not exist" 
containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641852 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897"} err="failed to get container status \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": rpc error: code = NotFound desc = could not find container \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": container with ID starting with 6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.641865 4764 scope.go:117] "RemoveContainer" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.642777 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc"} err="failed to get container status \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": rpc error: code = NotFound desc = could not find container \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": container with ID starting with a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.642807 4764 scope.go:117] "RemoveContainer" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643058 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5"} err="failed to get container status \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": rpc error: code = NotFound desc = could 
not find container \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": container with ID starting with fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643082 4764 scope.go:117] "RemoveContainer" containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643256 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897"} err="failed to get container status \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": rpc error: code = NotFound desc = could not find container \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": container with ID starting with 6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643278 4764 scope.go:117] "RemoveContainer" containerID="a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643457 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc"} err="failed to get container status \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": rpc error: code = NotFound desc = could not find container \"a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc\": container with ID starting with a5c4e6d8505a8289aa26ccf7c575a93e3da7c2d7cdd60e4172b05c2b9f0df8dc not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643481 4764 scope.go:117] "RemoveContainer" containerID="fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 
15:12:16.643656 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5"} err="failed to get container status \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": rpc error: code = NotFound desc = could not find container \"fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5\": container with ID starting with fdb3732be5d7340a1c34c31813cd83c1dd00160d57a73bb2b98d9198ff0929a5 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643680 4764 scope.go:117] "RemoveContainer" containerID="6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.643834 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897"} err="failed to get container status \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": rpc error: code = NotFound desc = could not find container \"6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897\": container with ID starting with 6757fc1be8f28a26af924df51856871259ea1df4b52656f623acb9a9cce09897 not found: ID does not exist" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.668462 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.881440 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.899131 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.909193 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.910981 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="sg-core" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.910997 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="sg-core" Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.911022 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="ceilometer-notification-agent" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.911028 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="ceilometer-notification-agent" Mar 20 15:12:16 crc kubenswrapper[4764]: E0320 15:12:16.911047 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="proxy-httpd" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.911053 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="proxy-httpd" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.911218 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="sg-core" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.911232 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="proxy-httpd" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.911242 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" containerName="ceilometer-notification-agent" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.914565 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.920837 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.921014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.921540 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.972810 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973146 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjjd\" (UniqueName: \"kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973341 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:16 crc kubenswrapper[4764]: I0320 15:12:16.973518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.037266 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts\") pod \"ceilometer-0\" (UID: 
\"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076787 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjjd\" (UniqueName: \"kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076847 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.076864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.078290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.078512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.082701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.084885 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.091085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.092726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjjd\" (UniqueName: \"kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.093078 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data\") pod \"ceilometer-0\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.139168 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2" path="/var/lib/kubelet/pods/1e7491f8-cdfe-4c7c-a1ea-dfca9bc7a1e2/volumes" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.259545 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.342321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.531845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" event={"ID":"83217f94-75f8-4f9b-b9d1-4247c602cc26","Type":"ContainerStarted","Data":"59d10a14895dddc75eda73f47e084a6ca150ba609f6efc06eb77693aefa513a8"} Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.533165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.539001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerStarted","Data":"7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d"} Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.541574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerStarted","Data":"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5"} Mar 20 15:12:17 crc 
kubenswrapper[4764]: I0320 15:12:17.541707 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api-log" containerID="cri-o://0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" gracePeriod=30 Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.541796 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api" containerID="cri-o://5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" gracePeriod=30 Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.541796 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.565453 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" podStartSLOduration=3.565435634 podStartE2EDuration="3.565435634s" podCreationTimestamp="2026-03-20 15:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:17.552703305 +0000 UTC m=+1259.168892434" watchObservedRunningTime="2026-03-20 15:12:17.565435634 +0000 UTC m=+1259.181624763" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 15:12:17.673750 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.673707151 podStartE2EDuration="3.673707151s" podCreationTimestamp="2026-03-20 15:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:17.57611572 +0000 UTC m=+1259.192304849" watchObservedRunningTime="2026-03-20 15:12:17.673707151 +0000 UTC m=+1259.289896280" Mar 20 15:12:17 crc kubenswrapper[4764]: I0320 
15:12:17.803123 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:17 crc kubenswrapper[4764]: W0320 15:12:17.861028 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bcaf93d_1c10_473b_9e49_641700b416d9.slice/crio-61385bb6ab46ab4050da6c639b3176cf54acc08ec3954a3d1136e72a6378e8d3 WatchSource:0}: Error finding container 61385bb6ab46ab4050da6c639b3176cf54acc08ec3954a3d1136e72a6378e8d3: Status 404 returned error can't find the container with id 61385bb6ab46ab4050da6c639b3176cf54acc08ec3954a3d1136e72a6378e8d3 Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.506397 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.508888 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86db6c8bf4-mghmx" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:59888->10.217.0.166:9311: read: connection reset by peer" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.508938 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86db6c8bf4-mghmx" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:59878->10.217.0.166:9311: read: connection reset by peer" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.548093 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600013 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s2fd\" (UniqueName: \"kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600666 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600699 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.600797 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom\") pod \"14683a5c-cf9b-4309-a835-2dd8643c7be8\" (UID: \"14683a5c-cf9b-4309-a835-2dd8643c7be8\") " Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.602101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs" (OuterVolumeSpecName: "logs") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.602196 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.602285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerStarted","Data":"9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf"} Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.604194 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14683a5c-cf9b-4309-a835-2dd8643c7be8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.604216 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14683a5c-cf9b-4309-a835-2dd8643c7be8-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.605942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607776 4764 generic.go:334] "Generic (PLEG): container finished" podID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerID="5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" exitCode=0 Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607799 4764 generic.go:334] "Generic (PLEG): container finished" podID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerID="0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" exitCode=143 Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerDied","Data":"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5"} Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerDied","Data":"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3"} Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14683a5c-cf9b-4309-a835-2dd8643c7be8","Type":"ContainerDied","Data":"2854449a7ac50c9df5cd169c10d176f8da1bafc87a2df5d02872c471d39223d6"} Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.607905 4764 scope.go:117] "RemoveContainer" containerID="5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.608080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts" (OuterVolumeSpecName: "scripts") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.608170 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.612061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd" (OuterVolumeSpecName: "kube-api-access-7s2fd") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "kube-api-access-7s2fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.613963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerStarted","Data":"61385bb6ab46ab4050da6c639b3176cf54acc08ec3954a3d1136e72a6378e8d3"} Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.633171 4764 scope.go:117] "RemoveContainer" containerID="0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.635363 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.74466172 podStartE2EDuration="4.63534218s" podCreationTimestamp="2026-03-20 15:12:14 +0000 UTC" firstStartedPulling="2026-03-20 15:12:15.751558228 +0000 UTC m=+1257.367747357" lastFinishedPulling="2026-03-20 15:12:16.642238688 +0000 UTC m=+1258.258427817" observedRunningTime="2026-03-20 15:12:18.631692038 +0000 UTC m=+1260.247881187" watchObservedRunningTime="2026-03-20 15:12:18.63534218 +0000 UTC m=+1260.251531309" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.643504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.696703 4764 scope.go:117] "RemoveContainer" containerID="5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" Mar 20 15:12:18 crc kubenswrapper[4764]: E0320 15:12:18.697870 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5\": container with ID starting with 5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5 not found: ID does not exist" containerID="5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.697917 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5"} err="failed to get container status \"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5\": rpc error: code = NotFound desc = could not find container \"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5\": container with ID starting with 5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5 not found: ID does not exist" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.697942 4764 scope.go:117] "RemoveContainer" containerID="0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" Mar 20 15:12:18 crc kubenswrapper[4764]: E0320 15:12:18.700910 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3\": container with ID starting with 
0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3 not found: ID does not exist" containerID="0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.700958 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3"} err="failed to get container status \"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3\": rpc error: code = NotFound desc = could not find container \"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3\": container with ID starting with 0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3 not found: ID does not exist" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.700983 4764 scope.go:117] "RemoveContainer" containerID="5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.702900 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data" (OuterVolumeSpecName: "config-data") pod "14683a5c-cf9b-4309-a835-2dd8643c7be8" (UID: "14683a5c-cf9b-4309-a835-2dd8643c7be8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.704184 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5"} err="failed to get container status \"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5\": rpc error: code = NotFound desc = could not find container \"5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5\": container with ID starting with 5cad1fce7a0620ac883bd792e149b5304c0b52faa5d80b1d0d3477696dee17d5 not found: ID does not exist" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.704218 4764 scope.go:117] "RemoveContainer" containerID="0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.705880 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.705927 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.705940 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s2fd\" (UniqueName: \"kubernetes.io/projected/14683a5c-cf9b-4309-a835-2dd8643c7be8-kube-api-access-7s2fd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.705967 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.705981 4764 reconciler_common.go:293] "Volume 
detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14683a5c-cf9b-4309-a835-2dd8643c7be8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.707922 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3"} err="failed to get container status \"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3\": rpc error: code = NotFound desc = could not find container \"0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3\": container with ID starting with 0c47e57e20a833ddd21c961d1e65f98947e0dc34ce37bcc157615e5d5334d0f3 not found: ID does not exist" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.835883 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.836104 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbdf79ddf-5llj7" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-api" containerID="cri-o://ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847" gracePeriod=30 Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.836661 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-fbdf79ddf-5llj7" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" containerID="cri-o://cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41" gracePeriod=30 Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.854547 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-fbdf79ddf-5llj7" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": read tcp 10.217.0.2:41564->10.217.0.156:9696: read: connection 
reset by peer" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.893279 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56ccbd6c69-4c89q"] Mar 20 15:12:18 crc kubenswrapper[4764]: E0320 15:12:18.893925 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.893936 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api" Mar 20 15:12:18 crc kubenswrapper[4764]: E0320 15:12:18.893949 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api-log" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.893955 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api-log" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.894110 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.894129 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" containerName="cinder-api-log" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.895267 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.901111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56ccbd6c69-4c89q"] Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.975899 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:18 crc kubenswrapper[4764]: I0320 15:12:18.995548 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.011918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnj6g\" (UniqueName: \"kubernetes.io/projected/33b46429-6eed-4b3c-8a29-39b923aad151-kube-api-access-lnj6g\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.011954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.011989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-internal-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.012006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-public-tls-certs\") 
pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.012088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-combined-ca-bundle\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.012144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-httpd-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.012172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-ovndb-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.030525 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.032006 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.033579 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.033661 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.036300 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.036446 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.112959 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-ovndb-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data-custom\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnj6g\" (UniqueName: \"kubernetes.io/projected/33b46429-6eed-4b3c-8a29-39b923aad151-kube-api-access-lnj6g\") pod \"neutron-56ccbd6c69-4c89q\" (UID: 
\"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btmx\" (UniqueName: \"kubernetes.io/projected/cadf6142-f417-45ee-9c6b-92378a298170-kube-api-access-9btmx\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-internal-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-public-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-scripts\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc 
kubenswrapper[4764]: I0320 15:12:19.113700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cadf6142-f417-45ee-9c6b-92378a298170-logs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-combined-ca-bundle\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cadf6142-f417-45ee-9c6b-92378a298170-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113835 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.113891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-httpd-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.120105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-httpd-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.122585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-internal-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.132355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-combined-ca-bundle\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.134688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-ovndb-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.142338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnj6g\" (UniqueName: \"kubernetes.io/projected/33b46429-6eed-4b3c-8a29-39b923aad151-kube-api-access-lnj6g\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.167151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-public-tls-certs\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.196657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33b46429-6eed-4b3c-8a29-39b923aad151-config\") pod \"neutron-56ccbd6c69-4c89q\" (UID: \"33b46429-6eed-4b3c-8a29-39b923aad151\") " pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.208712 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14683a5c-cf9b-4309-a835-2dd8643c7be8" path="/var/lib/kubelet/pods/14683a5c-cf9b-4309-a835-2dd8643c7be8/volumes" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 
15:12:19.214550 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle\") pod \"082cc398-d04e-430c-8de4-b1bc757cd290\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.214589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data\") pod \"082cc398-d04e-430c-8de4-b1bc757cd290\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.214767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom\") pod \"082cc398-d04e-430c-8de4-b1bc757cd290\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.214795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf4qx\" (UniqueName: \"kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx\") pod \"082cc398-d04e-430c-8de4-b1bc757cd290\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.214843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs\") pod \"082cc398-d04e-430c-8de4-b1bc757cd290\" (UID: \"082cc398-d04e-430c-8de4-b1bc757cd290\") " Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215067 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-scripts\") pod \"cinder-api-0\" (UID: 
\"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cadf6142-f417-45ee-9c6b-92378a298170-logs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cadf6142-f417-45ee-9c6b-92378a298170-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data-custom\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.215355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btmx\" (UniqueName: \"kubernetes.io/projected/cadf6142-f417-45ee-9c6b-92378a298170-kube-api-access-9btmx\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.217718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs" (OuterVolumeSpecName: "logs") pod "082cc398-d04e-430c-8de4-b1bc757cd290" (UID: "082cc398-d04e-430c-8de4-b1bc757cd290"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.218507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cadf6142-f417-45ee-9c6b-92378a298170-logs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.219240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cadf6142-f417-45ee-9c6b-92378a298170-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.221753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.221759 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.221926 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.222610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-scripts\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.222869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "082cc398-d04e-430c-8de4-b1bc757cd290" (UID: "082cc398-d04e-430c-8de4-b1bc757cd290"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.233166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.233992 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.238799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data-custom\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.240301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btmx\" (UniqueName: \"kubernetes.io/projected/cadf6142-f417-45ee-9c6b-92378a298170-kube-api-access-9btmx\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.247313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-config-data\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.247517 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 
15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.247962 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.256881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "082cc398-d04e-430c-8de4-b1bc757cd290" (UID: "082cc398-d04e-430c-8de4-b1bc757cd290"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.268539 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx" (OuterVolumeSpecName: "kube-api-access-jf4qx") pod "082cc398-d04e-430c-8de4-b1bc757cd290" (UID: "082cc398-d04e-430c-8de4-b1bc757cd290"). InnerVolumeSpecName "kube-api-access-jf4qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.269068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadf6142-f417-45ee-9c6b-92378a298170-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cadf6142-f417-45ee-9c6b-92378a298170\") " pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.317893 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.318123 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.318132 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf4qx\" (UniqueName: \"kubernetes.io/projected/082cc398-d04e-430c-8de4-b1bc757cd290-kube-api-access-jf4qx\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.318142 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/082cc398-d04e-430c-8de4-b1bc757cd290-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.331462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data" (OuterVolumeSpecName: "config-data") pod "082cc398-d04e-430c-8de4-b1bc757cd290" (UID: "082cc398-d04e-430c-8de4-b1bc757cd290"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.421502 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/082cc398-d04e-430c-8de4-b1bc757cd290-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.532700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.648370 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-655785589d-5cnb4" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.649064 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86db6c8bf4-mghmx" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.649082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerDied","Data":"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643"} Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.649157 4764 scope.go:117] "RemoveContainer" containerID="c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.648950 4764 generic.go:334] "Generic (PLEG): container finished" podID="082cc398-d04e-430c-8de4-b1bc757cd290" containerID="c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643" exitCode=0 Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.661130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86db6c8bf4-mghmx" event={"ID":"082cc398-d04e-430c-8de4-b1bc757cd290","Type":"ContainerDied","Data":"5087e4f79d02afc5c3a3c24d0d6f90f10eea76feb9b6c39c7018ac9c6782749c"} Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.678619 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerID="cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41" exitCode=0 Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.678698 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerDied","Data":"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41"} Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.684184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerStarted","Data":"4753aae83fa557447da001506741b8de664773d731ea8688f8f37831d3b3a923"} Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.728586 4764 scope.go:117] "RemoveContainer" containerID="93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.743747 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.764428 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86db6c8bf4-mghmx"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.778470 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.778812 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66899c9d8-zh5gp" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon-log" containerID="cri-o://8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41" gracePeriod=30 Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.779264 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66899c9d8-zh5gp" 
podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" containerID="cri-o://c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12" gracePeriod=30 Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.789616 4764 scope.go:117] "RemoveContainer" containerID="c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643" Mar 20 15:12:19 crc kubenswrapper[4764]: E0320 15:12:19.793224 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643\": container with ID starting with c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643 not found: ID does not exist" containerID="c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.793282 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643"} err="failed to get container status \"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643\": rpc error: code = NotFound desc = could not find container \"c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643\": container with ID starting with c51563330939554c7c870b386f8d662c0958b3c1987669fd186f9096b12a3643 not found: ID does not exist" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.793311 4764 scope.go:117] "RemoveContainer" containerID="93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4" Mar 20 15:12:19 crc kubenswrapper[4764]: E0320 15:12:19.796562 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4\": container with ID starting with 93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4 not found: ID does not exist" 
containerID="93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.796594 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4"} err="failed to get container status \"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4\": rpc error: code = NotFound desc = could not find container \"93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4\": container with ID starting with 93413dc32a6a1ecf31a58bd990f89db99c4059f8eaa2b0b71deddf1a7e7827a4 not found: ID does not exist" Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.855688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56ccbd6c69-4c89q"] Mar 20 15:12:19 crc kubenswrapper[4764]: I0320 15:12:19.964882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.013192 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.721886 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56ccbd6c69-4c89q" event={"ID":"33b46429-6eed-4b3c-8a29-39b923aad151","Type":"ContainerStarted","Data":"b30f4db03c53c0157b47a2ced02c35568660ae018a30e999e044e03f11466cea"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.722187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56ccbd6c69-4c89q" event={"ID":"33b46429-6eed-4b3c-8a29-39b923aad151","Type":"ContainerStarted","Data":"12b3d71d379d9ae5c6a50589731cb82293fd3bdfa75b787e0c56eadea15ed7e3"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.722198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56ccbd6c69-4c89q" 
event={"ID":"33b46429-6eed-4b3c-8a29-39b923aad151","Type":"ContainerStarted","Data":"f68c92eb8b209d5ec226f28e0c1f183f317ca972935bfd4d572ab4112dc0156d"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.724061 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56ccbd6c69-4c89q" Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.727758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerStarted","Data":"20007b377ef0b58fe74f1ff6da8740ea3a1952bc3d29ce710786ad6ece83d461"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.730110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cadf6142-f417-45ee-9c6b-92378a298170","Type":"ContainerStarted","Data":"24acb1b3efa1225f538544b1875fec9a2ca0d9ff0c3c2fc1e883dd18039e2de6"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.730170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cadf6142-f417-45ee-9c6b-92378a298170","Type":"ContainerStarted","Data":"1a68e9e91adf9389298a697e6b8eb1278129770f1e06be44ca62c7667a7eab0d"} Mar 20 15:12:20 crc kubenswrapper[4764]: I0320 15:12:20.752532 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56ccbd6c69-4c89q" podStartSLOduration=2.752516839 podStartE2EDuration="2.752516839s" podCreationTimestamp="2026-03-20 15:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:20.748451215 +0000 UTC m=+1262.364640344" watchObservedRunningTime="2026-03-20 15:12:20.752516839 +0000 UTC m=+1262.368705968" Mar 20 15:12:21 crc kubenswrapper[4764]: I0320 15:12:21.139982 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" 
path="/var/lib/kubelet/pods/082cc398-d04e-430c-8de4-b1bc757cd290/volumes" Mar 20 15:12:21 crc kubenswrapper[4764]: I0320 15:12:21.739255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerStarted","Data":"abd5385a4d46cdbb7f4c861a18ad57c1f6dfe1d8e546ff46a0ad00a7e2e062ba"} Mar 20 15:12:21 crc kubenswrapper[4764]: I0320 15:12:21.741326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cadf6142-f417-45ee-9c6b-92378a298170","Type":"ContainerStarted","Data":"8e9e4d27592ffb77874e154eceb09649ff40f89f7077e883cdb245cf6e16f6f1"} Mar 20 15:12:21 crc kubenswrapper[4764]: I0320 15:12:21.741428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 15:12:21 crc kubenswrapper[4764]: I0320 15:12:21.758525 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.758508472 podStartE2EDuration="3.758508472s" podCreationTimestamp="2026-03-20 15:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:21.755608444 +0000 UTC m=+1263.371797583" watchObservedRunningTime="2026-03-20 15:12:21.758508472 +0000 UTC m=+1263.374697601" Mar 20 15:12:22 crc kubenswrapper[4764]: I0320 15:12:22.496501 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-fbdf79ddf-5llj7" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9696/\": dial tcp 10.217.0.156:9696: connect: connection refused" Mar 20 15:12:23 crc kubenswrapper[4764]: I0320 15:12:23.758706 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerStarted","Data":"a35adaac14cf8966b239cd84c8698aa9db383b666dff5a4386d8334e2c741da8"} Mar 20 15:12:23 crc kubenswrapper[4764]: I0320 15:12:23.759167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:12:23 crc kubenswrapper[4764]: I0320 15:12:23.760731 4764 generic.go:334] "Generic (PLEG): container finished" podID="e532b989-f73c-49a1-b4f2-43322246a71e" containerID="c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12" exitCode=0 Mar 20 15:12:23 crc kubenswrapper[4764]: I0320 15:12:23.760752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerDied","Data":"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12"} Mar 20 15:12:23 crc kubenswrapper[4764]: I0320 15:12:23.801201 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.116820713 podStartE2EDuration="7.801179886s" podCreationTimestamp="2026-03-20 15:12:16 +0000 UTC" firstStartedPulling="2026-03-20 15:12:17.869140569 +0000 UTC m=+1259.485329698" lastFinishedPulling="2026-03-20 15:12:22.553499742 +0000 UTC m=+1264.169688871" observedRunningTime="2026-03-20 15:12:23.787052904 +0000 UTC m=+1265.403242043" watchObservedRunningTime="2026-03-20 15:12:23.801179886 +0000 UTC m=+1265.417369015" Mar 20 15:12:24 crc kubenswrapper[4764]: I0320 15:12:24.752759 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66899c9d8-zh5gp" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 15:12:24 crc kubenswrapper[4764]: I0320 15:12:24.943045 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.011029 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"] Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.011446 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="dnsmasq-dns" containerID="cri-o://1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07" gracePeriod=10 Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.614119 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.635497 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.674761 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.779148 4764 generic.go:334] "Generic (PLEG): container finished" podID="df6da429-0f30-4521-b52e-304ca4830075" containerID="1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07" exitCode=0 Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.779881 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="cinder-scheduler" containerID="cri-o://7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d" gracePeriod=30 Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.779719 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.780834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.780934 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.781090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.781167 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.781229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c84b2\" (UniqueName: \"kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.781264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0\") pod \"df6da429-0f30-4521-b52e-304ca4830075\" (UID: \"df6da429-0f30-4521-b52e-304ca4830075\") " Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.779666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" event={"ID":"df6da429-0f30-4521-b52e-304ca4830075","Type":"ContainerDied","Data":"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07"} Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.782021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-n5xlv" event={"ID":"df6da429-0f30-4521-b52e-304ca4830075","Type":"ContainerDied","Data":"e2fddaa6816ed99043d5ab82702eb3e26bb70bae66bc978e1d6f532475e23fd4"} Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.782090 4764 scope.go:117] "RemoveContainer" containerID="1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.780038 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="probe" containerID="cri-o://9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf" gracePeriod=30 Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.789116 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2" (OuterVolumeSpecName: "kube-api-access-c84b2") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "kube-api-access-c84b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.842875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.847830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.849718 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config" (OuterVolumeSpecName: "config") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.852588 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.884486 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.884520 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c84b2\" (UniqueName: \"kubernetes.io/projected/df6da429-0f30-4521-b52e-304ca4830075-kube-api-access-c84b2\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.884534 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.884542 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.884550 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.887078 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df6da429-0f30-4521-b52e-304ca4830075" (UID: "df6da429-0f30-4521-b52e-304ca4830075"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.927821 4764 scope.go:117] "RemoveContainer" containerID="d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.954437 4764 scope.go:117] "RemoveContainer" containerID="1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07" Mar 20 15:12:25 crc kubenswrapper[4764]: E0320 15:12:25.955174 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07\": container with ID starting with 1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07 not found: ID does not exist" containerID="1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.955277 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07"} err="failed to get container status \"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07\": rpc error: code = NotFound desc = could not find container \"1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07\": container with ID starting with 1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07 not found: ID does not exist" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.955358 4764 scope.go:117] "RemoveContainer" containerID="d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a" Mar 20 15:12:25 crc kubenswrapper[4764]: E0320 15:12:25.955935 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a\": container with ID starting with 
d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a not found: ID does not exist" containerID="d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.955964 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a"} err="failed to get container status \"d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a\": rpc error: code = NotFound desc = could not find container \"d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a\": container with ID starting with d4754e0959d3d674517bec732f88711ca67537308508737549020ef0054a280a not found: ID does not exist" Mar 20 15:12:25 crc kubenswrapper[4764]: I0320 15:12:25.986608 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df6da429-0f30-4521-b52e-304ca4830075-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:26 crc kubenswrapper[4764]: I0320 15:12:26.108426 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"] Mar 20 15:12:26 crc kubenswrapper[4764]: I0320 15:12:26.115236 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-n5xlv"] Mar 20 15:12:26 crc kubenswrapper[4764]: I0320 15:12:26.789364 4764 generic.go:334] "Generic (PLEG): container finished" podID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerID="9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf" exitCode=0 Mar 20 15:12:26 crc kubenswrapper[4764]: I0320 15:12:26.789648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerDied","Data":"9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf"} Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.138138 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6da429-0f30-4521-b52e-304ca4830075" path="/var/lib/kubelet/pods/df6da429-0f30-4521-b52e-304ca4830075/volumes" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.244195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.344686 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598088 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64c4774c48-rhcrn"] Mar 20 15:12:27 crc kubenswrapper[4764]: E0320 15:12:27.598439 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api-log" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598455 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api-log" Mar 20 15:12:27 crc kubenswrapper[4764]: E0320 15:12:27.598473 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="dnsmasq-dns" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598482 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="dnsmasq-dns" Mar 20 15:12:27 crc kubenswrapper[4764]: E0320 15:12:27.598492 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="init" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598498 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="init" Mar 20 15:12:27 crc kubenswrapper[4764]: E0320 15:12:27.598525 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598530 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598679 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598692 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6da429-0f30-4521-b52e-304ca4830075" containerName="dnsmasq-dns" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.598711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="082cc398-d04e-430c-8de4-b1bc757cd290" containerName="barbican-api-log" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.599560 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.618937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64c4774c48-rhcrn"] Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.718564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnr8c\" (UniqueName: \"kubernetes.io/projected/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-kube-api-access-wnr8c\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.718906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-scripts\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " 
pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.718979 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-logs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.719370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-config-data\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.719498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-internal-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.719636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-combined-ca-bundle\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.720035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-public-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") 
" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821522 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-config-data\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-internal-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-combined-ca-bundle\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-public-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnr8c\" (UniqueName: \"kubernetes.io/projected/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-kube-api-access-wnr8c\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 
15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.821970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-scripts\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.822047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-logs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.822597 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-logs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.826959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-public-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.829537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-scripts\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.829986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-config-data\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.829657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-internal-tls-certs\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.842592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-combined-ca-bundle\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.842870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnr8c\" (UniqueName: \"kubernetes.io/projected/2f95b4a1-bd62-4e6a-8968-2de23aa0f532-kube-api-access-wnr8c\") pod \"placement-64c4774c48-rhcrn\" (UID: \"2f95b4a1-bd62-4e6a-8968-2de23aa0f532\") " pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:27 crc kubenswrapper[4764]: I0320 15:12:27.913750 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.278691 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64c4774c48-rhcrn"] Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.557565 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7c69f8c7f-rlffj" Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.810111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64c4774c48-rhcrn" event={"ID":"2f95b4a1-bd62-4e6a-8968-2de23aa0f532","Type":"ContainerStarted","Data":"1ed5effab84fea81b65a19c6e5adc131273be3dd43abb02bd9691af0643e7f1a"} Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.810151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64c4774c48-rhcrn" event={"ID":"2f95b4a1-bd62-4e6a-8968-2de23aa0f532","Type":"ContainerStarted","Data":"631a42c06716319b8e11938bfff8147256c918ad7d496350d657da2abdc84dd9"} Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.810161 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64c4774c48-rhcrn" event={"ID":"2f95b4a1-bd62-4e6a-8968-2de23aa0f532","Type":"ContainerStarted","Data":"8170cb3cf71330b03a6a0483ad413870757244fb60f1b9f1e25a9b065f0cb662"} Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.810791 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:28 crc kubenswrapper[4764]: I0320 15:12:28.810818 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.269023 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.297910 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64c4774c48-rhcrn" podStartSLOduration=2.297894997 podStartE2EDuration="2.297894997s" podCreationTimestamp="2026-03-20 15:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:28.843068816 +0000 UTC m=+1270.459257945" watchObservedRunningTime="2026-03-20 15:12:29.297894997 +0000 UTC m=+1270.914084126" Mar 20 15:12:29 crc kubenswrapper[4764]: E0320 15:12:29.318023 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode532b989_f73c_49a1_b4f2_43322246a71e.slice/crio-conmon-c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6da429_0f30_4521_b52e_304ca4830075.slice/crio-e2fddaa6816ed99043d5ab82702eb3e26bb70bae66bc978e1d6f532475e23fd4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode532b989_f73c_49a1_b4f2_43322246a71e.slice/crio-c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07deed84_d17d_4b5f_955d_7087ecbc782d.slice/crio-conmon-ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07deed84_d17d_4b5f_955d_7087ecbc782d.slice/crio-ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847.scope\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cce9cce_4528_46d0_bd82_1361e9c78d45.slice/crio-7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6da429_0f30_4521_b52e_304ca4830075.slice/crio-1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cce9cce_4528_46d0_bd82_1361e9c78d45.slice/crio-9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6da429_0f30_4521_b52e_304ca4830075.slice/crio-conmon-1e230845e7ca81746a48eacb5841cbd16f023c2770ef1ca2ef863e25c2151b07.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6da429_0f30_4521_b52e_304ca4830075.slice\": RecentStats: unable to find data in memory cache]" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.359747 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360216 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360260 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360451 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzncg\" (UniqueName: \"kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360534 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.360558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs\") pod \"07deed84-d17d-4b5f-955d-7087ecbc782d\" (UID: \"07deed84-d17d-4b5f-955d-7087ecbc782d\") " Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.366407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.368577 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg" (OuterVolumeSpecName: "kube-api-access-zzncg") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "kube-api-access-zzncg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.416445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.417212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config" (OuterVolumeSpecName: "config") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.419691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.438592 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.458043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "07deed84-d17d-4b5f-955d-7087ecbc782d" (UID: "07deed84-d17d-4b5f-955d-7087ecbc782d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.463771 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.463898 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzncg\" (UniqueName: \"kubernetes.io/projected/07deed84-d17d-4b5f-955d-7087ecbc782d-kube-api-access-zzncg\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.463985 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.464042 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-ovndb-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.464094 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.464144 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.464196 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07deed84-d17d-4b5f-955d-7087ecbc782d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.826905 4764 generic.go:334] "Generic (PLEG): container finished" podID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerID="ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847" exitCode=0 Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.826984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerDied","Data":"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847"} Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.827034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbdf79ddf-5llj7" event={"ID":"07deed84-d17d-4b5f-955d-7087ecbc782d","Type":"ContainerDied","Data":"8498500cf242d708c9226c9da6acbc01edab34bce7f4b1a0a7682557e14f851e"} Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.827052 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbdf79ddf-5llj7" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.827060 4764 scope.go:117] "RemoveContainer" containerID="cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.832684 4764 generic.go:334] "Generic (PLEG): container finished" podID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerID="7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d" exitCode=0 Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.834201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerDied","Data":"7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d"} Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.965136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.986712 4764 scope.go:117] "RemoveContainer" containerID="ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847" Mar 20 15:12:29 crc kubenswrapper[4764]: I0320 15:12:29.993649 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.005158 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fbdf79ddf-5llj7"] Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.020371 4764 scope.go:117] "RemoveContainer" containerID="cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41" Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.020866 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41\": container with ID starting with 
cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41 not found: ID does not exist" containerID="cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.020902 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41"} err="failed to get container status \"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41\": rpc error: code = NotFound desc = could not find container \"cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41\": container with ID starting with cd8fff0d4cb23314d8eee28de5de0957f49b31db0971d2473cb1a9bd3c0acd41 not found: ID does not exist" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.020921 4764 scope.go:117] "RemoveContainer" containerID="ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847" Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.021237 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847\": container with ID starting with ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847 not found: ID does not exist" containerID="ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.021258 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847"} err="failed to get container status \"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847\": rpc error: code = NotFound desc = could not find container \"ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847\": container with ID starting with ea71f05a611d59118631831ac5318998a0b770222d66c98b34bc222a150bc847 not found: ID does not 
exist" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.082911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.082972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083009 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083130 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpjrp\" (UniqueName: \"kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083268 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts\") pod \"5cce9cce-4528-46d0-bd82-1361e9c78d45\" (UID: \"5cce9cce-4528-46d0-bd82-1361e9c78d45\") " Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.083623 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5cce9cce-4528-46d0-bd82-1361e9c78d45-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.128281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp" (OuterVolumeSpecName: "kube-api-access-zpjrp") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "kube-api-access-zpjrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.128459 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts" (OuterVolumeSpecName: "scripts") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.129038 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.186790 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.186819 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.186831 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpjrp\" (UniqueName: \"kubernetes.io/projected/5cce9cce-4528-46d0-bd82-1361e9c78d45-kube-api-access-zpjrp\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.210336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.288478 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.290021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data" (OuterVolumeSpecName: "config-data") pod "5cce9cce-4528-46d0-bd82-1361e9c78d45" (UID: "5cce9cce-4528-46d0-bd82-1361e9c78d45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.390270 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cce9cce-4528-46d0-bd82-1361e9c78d45-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.850874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5cce9cce-4528-46d0-bd82-1361e9c78d45","Type":"ContainerDied","Data":"ddb2e1624aca48fff050a645a9f8eb35987b96033737ac605c29612ecdf643b6"} Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.851160 4764 scope.go:117] "RemoveContainer" containerID="9c9501f8c7f08cbd781826710c0c7c7c8bddf533d84cd04493db0b989f7150bf" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.851263 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.914481 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.934596 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945269 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.945652 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="probe" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945666 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="probe" Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.945708 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="cinder-scheduler" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945715 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="cinder-scheduler" Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.945725 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-api" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945732 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-api" Mar 20 15:12:30 crc kubenswrapper[4764]: E0320 15:12:30.945744 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945750 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945926 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-api" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945946 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="cinder-scheduler" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945961 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" containerName="neutron-httpd" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.945973 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" containerName="probe" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.946946 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.951984 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 15:12:30 crc kubenswrapper[4764]: I0320 15:12:30.952263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.022321 4764 scope.go:117] "RemoveContainer" containerID="7ce646b1a46a1739ee06da160f438d260722d50a75267964326ccb9104cfcc1d" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104111 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qhm\" (UniqueName: \"kubernetes.io/projected/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-kube-api-access-r9qhm\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104208 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.104253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.134528 4764 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="07deed84-d17d-4b5f-955d-7087ecbc782d" path="/var/lib/kubelet/pods/07deed84-d17d-4b5f-955d-7087ecbc782d/volumes" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.135499 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cce9cce-4528-46d0-bd82-1361e9c78d45" path="/var/lib/kubelet/pods/5cce9cce-4528-46d0-bd82-1361e9c78d45/volumes" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.149552 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.150875 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.152703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r9l9q" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.152713 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.153074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.175020 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206432 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qhm\" (UniqueName: \"kubernetes.io/projected/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-kube-api-access-r9qhm\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.206727 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.211933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.217725 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.224406 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-scripts\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.226774 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qhm\" (UniqueName: \"kubernetes.io/projected/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-kube-api-access-r9qhm\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.227849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a766ed7e-8243-41e5-b3b1-bc3bdd0e069f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f\") " pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.263026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.308319 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.308413 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.308450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zfj\" (UniqueName: \"kubernetes.io/projected/be96e3e8-4879-473f-b5e2-34af484ddfcc-kube-api-access-89zfj\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.308484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.410448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 
20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.410743 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89zfj\" (UniqueName: \"kubernetes.io/projected/be96e3e8-4879-473f-b5e2-34af484ddfcc-kube-api-access-89zfj\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.410781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.410882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.412297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.417403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.424071 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be96e3e8-4879-473f-b5e2-34af484ddfcc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.455472 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zfj\" (UniqueName: \"kubernetes.io/projected/be96e3e8-4879-473f-b5e2-34af484ddfcc-kube-api-access-89zfj\") pod \"openstackclient\" (UID: \"be96e3e8-4879-473f-b5e2-34af484ddfcc\") " pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.464245 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.480665 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 15:12:31 crc kubenswrapper[4764]: I0320 15:12:31.858338 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 15:12:31 crc kubenswrapper[4764]: W0320 15:12:31.859648 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda766ed7e_8243_41e5_b3b1_bc3bdd0e069f.slice/crio-1409b56f3e74d9efe9c6800862e9f2fa7271a2f5e21a863a681732908265c295 WatchSource:0}: Error finding container 1409b56f3e74d9efe9c6800862e9f2fa7271a2f5e21a863a681732908265c295: Status 404 returned error can't find the container with id 1409b56f3e74d9efe9c6800862e9f2fa7271a2f5e21a863a681732908265c295 Mar 20 15:12:32 crc kubenswrapper[4764]: I0320 15:12:32.043448 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 15:12:32 crc kubenswrapper[4764]: W0320 15:12:32.052768 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe96e3e8_4879_473f_b5e2_34af484ddfcc.slice/crio-e9e2c398a088135d942b83c725eb67c848eda57df4b7a26759650f41a097506e WatchSource:0}: Error finding container e9e2c398a088135d942b83c725eb67c848eda57df4b7a26759650f41a097506e: Status 404 returned error can't find the container with id e9e2c398a088135d942b83c725eb67c848eda57df4b7a26759650f41a097506e Mar 20 15:12:32 crc kubenswrapper[4764]: I0320 15:12:32.872172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"be96e3e8-4879-473f-b5e2-34af484ddfcc","Type":"ContainerStarted","Data":"e9e2c398a088135d942b83c725eb67c848eda57df4b7a26759650f41a097506e"} Mar 20 15:12:32 crc kubenswrapper[4764]: I0320 15:12:32.875120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f","Type":"ContainerStarted","Data":"e43673973e93e5ca1495c182160c008d422fd120cf9fe5470faaabbd275bee88"} Mar 20 15:12:32 crc kubenswrapper[4764]: I0320 15:12:32.875462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f","Type":"ContainerStarted","Data":"1409b56f3e74d9efe9c6800862e9f2fa7271a2f5e21a863a681732908265c295"} Mar 20 15:12:33 crc kubenswrapper[4764]: I0320 15:12:33.885825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a766ed7e-8243-41e5-b3b1-bc3bdd0e069f","Type":"ContainerStarted","Data":"b3aab5dda34890a8fa86980907dc5a37a7b62dd15677e53c98b30a378dd1b415"} Mar 20 15:12:33 crc kubenswrapper[4764]: I0320 15:12:33.915024 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.915001384 podStartE2EDuration="3.915001384s" podCreationTimestamp="2026-03-20 15:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:33.90766977 +0000 UTC m=+1275.523858899" watchObservedRunningTime="2026-03-20 15:12:33.915001384 +0000 UTC m=+1275.531190513" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.751796 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66899c9d8-zh5gp" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.892427 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.896688 4764 generic.go:334] "Generic (PLEG): container finished" podID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerID="ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e" exitCode=137 Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.896746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerDied","Data":"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e"} Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.896769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" event={"ID":"2738d26f-7f78-45d6-a3e4-5ad8ac27c237","Type":"ContainerDied","Data":"4a8ae80f73d8852b82e8d397426ecb149e483377595a75475ebf053325df20a7"} Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.896785 4764 scope.go:117] "RemoveContainer" containerID="ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.896782 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-678fbbd7fd-2hzg8" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.902987 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.910110 4764 generic.go:334] "Generic (PLEG): container finished" podID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerID="bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce" exitCode=137 Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.910472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerDied","Data":"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce"} Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.910522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6586fcc5bc-m65hd" event={"ID":"96e6e687-6b83-45b0-b616-aeafd9f0faa4","Type":"ContainerDied","Data":"e06f1e262f5d329a30b57b703e20b6b4d79addde2c20c4d3b6114e36301e112c"} Mar 20 15:12:34 crc kubenswrapper[4764]: I0320 15:12:34.929533 4764 scope.go:117] "RemoveContainer" containerID="d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.006911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs\") pod \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007009 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom\") pod 
\"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007070 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tkd\" (UniqueName: \"kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd\") pod \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs\") pod \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007156 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle\") pod \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007174 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4llj\" (UniqueName: \"kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj\") pod \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data\") pod \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007259 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle\") pod \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data\") pod \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\" (UID: \"2738d26f-7f78-45d6-a3e4-5ad8ac27c237\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.007308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom\") pod \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\" (UID: \"96e6e687-6b83-45b0-b616-aeafd9f0faa4\") " Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.008778 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs" (OuterVolumeSpecName: "logs") pod "2738d26f-7f78-45d6-a3e4-5ad8ac27c237" (UID: "2738d26f-7f78-45d6-a3e4-5ad8ac27c237"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.028829 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs" (OuterVolumeSpecName: "logs") pod "96e6e687-6b83-45b0-b616-aeafd9f0faa4" (UID: "96e6e687-6b83-45b0-b616-aeafd9f0faa4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.052193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "96e6e687-6b83-45b0-b616-aeafd9f0faa4" (UID: "96e6e687-6b83-45b0-b616-aeafd9f0faa4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.060600 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd" (OuterVolumeSpecName: "kube-api-access-k7tkd") pod "2738d26f-7f78-45d6-a3e4-5ad8ac27c237" (UID: "2738d26f-7f78-45d6-a3e4-5ad8ac27c237"). InnerVolumeSpecName "kube-api-access-k7tkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.060702 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj" (OuterVolumeSpecName: "kube-api-access-g4llj") pod "96e6e687-6b83-45b0-b616-aeafd9f0faa4" (UID: "96e6e687-6b83-45b0-b616-aeafd9f0faa4"). InnerVolumeSpecName "kube-api-access-g4llj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.060755 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2738d26f-7f78-45d6-a3e4-5ad8ac27c237" (UID: "2738d26f-7f78-45d6-a3e4-5ad8ac27c237"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.060862 4764 scope.go:117] "RemoveContainer" containerID="ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e" Mar 20 15:12:35 crc kubenswrapper[4764]: E0320 15:12:35.064513 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e\": container with ID starting with ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e not found: ID does not exist" containerID="ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.064559 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e"} err="failed to get container status \"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e\": rpc error: code = NotFound desc = could not find container \"ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e\": container with ID starting with ab0d3b97db03b59b5338007bc4d38fec6ab3f21fcf751263403121548314b18e not found: ID does not exist" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.064583 4764 scope.go:117] "RemoveContainer" containerID="d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04" Mar 20 15:12:35 crc kubenswrapper[4764]: E0320 15:12:35.082845 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04\": container with ID starting with d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04 not found: ID does not exist" containerID="d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.082888 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04"} err="failed to get container status \"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04\": rpc error: code = NotFound desc = could not find container \"d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04\": container with ID starting with d0f6af767210e05a3ea05b2059f4a73f900bbd7234e859ebded3d4c830eabe04 not found: ID does not exist" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.082912 4764 scope.go:117] "RemoveContainer" containerID="bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109602 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109641 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109651 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7tkd\" (UniqueName: \"kubernetes.io/projected/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-kube-api-access-k7tkd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109659 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e6e687-6b83-45b0-b616-aeafd9f0faa4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109673 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4llj\" (UniqueName: 
\"kubernetes.io/projected/96e6e687-6b83-45b0-b616-aeafd9f0faa4-kube-api-access-g4llj\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.109685 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.119889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96e6e687-6b83-45b0-b616-aeafd9f0faa4" (UID: "96e6e687-6b83-45b0-b616-aeafd9f0faa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.139124 4764 scope.go:117] "RemoveContainer" containerID="d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.147566 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2738d26f-7f78-45d6-a3e4-5ad8ac27c237" (UID: "2738d26f-7f78-45d6-a3e4-5ad8ac27c237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.170567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data" (OuterVolumeSpecName: "config-data") pod "2738d26f-7f78-45d6-a3e4-5ad8ac27c237" (UID: "2738d26f-7f78-45d6-a3e4-5ad8ac27c237"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.173283 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data" (OuterVolumeSpecName: "config-data") pod "96e6e687-6b83-45b0-b616-aeafd9f0faa4" (UID: "96e6e687-6b83-45b0-b616-aeafd9f0faa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.173312 4764 scope.go:117] "RemoveContainer" containerID="bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce" Mar 20 15:12:35 crc kubenswrapper[4764]: E0320 15:12:35.173834 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce\": container with ID starting with bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce not found: ID does not exist" containerID="bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.173863 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce"} err="failed to get container status \"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce\": rpc error: code = NotFound desc = could not find container \"bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce\": container with ID starting with bdddcc7f05f30b0033dc0e7491dabc7c238df8d21b4e45417a7a7621c9e3f0ce not found: ID does not exist" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.173886 4764 scope.go:117] "RemoveContainer" containerID="d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61" Mar 20 15:12:35 crc kubenswrapper[4764]: E0320 15:12:35.174212 4764 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61\": container with ID starting with d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61 not found: ID does not exist" containerID="d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.174232 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61"} err="failed to get container status \"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61\": rpc error: code = NotFound desc = could not find container \"d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61\": container with ID starting with d99b685ab643c5606eb99be2f405ff460831b748dd8066a83e58acec8f9beb61 not found: ID does not exist" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.211540 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.211569 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.211579 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96e6e687-6b83-45b0-b616-aeafd9f0faa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.211587 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2738d26f-7f78-45d6-a3e4-5ad8ac27c237-config-data\") on node 
\"crc\" DevicePath \"\"" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.237201 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"] Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.246728 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-678fbbd7fd-2hzg8"] Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.924828 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6586fcc5bc-m65hd" Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.956869 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"] Mar 20 15:12:35 crc kubenswrapper[4764]: I0320 15:12:35.966037 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6586fcc5bc-m65hd"] Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.264140 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404268 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7957c5bfd5-lfmpr"] Mar 20 15:12:36 crc kubenswrapper[4764]: E0320 15:12:36.404674 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404690 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener" Mar 20 15:12:36 crc kubenswrapper[4764]: E0320 15:12:36.404703 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404709 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker" Mar 20 15:12:36 crc kubenswrapper[4764]: E0320 15:12:36.404727 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker-log" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404733 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker-log" Mar 20 15:12:36 crc kubenswrapper[4764]: E0320 15:12:36.404747 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener-log" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404752 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener-log" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404902 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404918 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker-log" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404929 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" containerName="barbican-keystone-listener-log" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.404943 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" containerName="barbican-worker" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.405787 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.412888 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.413262 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.413506 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.430229 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7957c5bfd5-lfmpr"] Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-internal-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6s7\" (UniqueName: \"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-kube-api-access-dd6s7\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-run-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc 
kubenswrapper[4764]: I0320 15:12:36.546488 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-public-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546510 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-log-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-combined-ca-bundle\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-etc-swift\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.546925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-config-data\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" 
Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648647 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-config-data\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-internal-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648756 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6s7\" (UniqueName: \"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-kube-api-access-dd6s7\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-run-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-public-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 
15:12:36.648833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-log-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648847 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-combined-ca-bundle\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.648905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-etc-swift\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.653953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-etc-swift\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.656056 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-log-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.656129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/884c1077-0801-4570-b8ef-195767d65d2c-run-httpd\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.659481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-config-data\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.660136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-combined-ca-bundle\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.660324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-public-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.668329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/884c1077-0801-4570-b8ef-195767d65d2c-internal-tls-certs\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.681541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6s7\" (UniqueName: 
\"kubernetes.io/projected/884c1077-0801-4570-b8ef-195767d65d2c-kube-api-access-dd6s7\") pod \"swift-proxy-7957c5bfd5-lfmpr\" (UID: \"884c1077-0801-4570-b8ef-195767d65d2c\") " pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:36 crc kubenswrapper[4764]: I0320 15:12:36.723513 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.067752 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.068591 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-central-agent" containerID="cri-o://4753aae83fa557447da001506741b8de664773d731ea8688f8f37831d3b3a923" gracePeriod=30 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.068638 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="sg-core" containerID="cri-o://abd5385a4d46cdbb7f4c861a18ad57c1f6dfe1d8e546ff46a0ad00a7e2e062ba" gracePeriod=30 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.068674 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-notification-agent" containerID="cri-o://20007b377ef0b58fe74f1ff6da8740ea3a1952bc3d29ce710786ad6ece83d461" gracePeriod=30 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.068716 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="proxy-httpd" containerID="cri-o://a35adaac14cf8966b239cd84c8698aa9db383b666dff5a4386d8334e2c741da8" gracePeriod=30 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 
15:12:37.086414 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": EOF" Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.135663 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2738d26f-7f78-45d6-a3e4-5ad8ac27c237" path="/var/lib/kubelet/pods/2738d26f-7f78-45d6-a3e4-5ad8ac27c237/volumes" Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.136733 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e6e687-6b83-45b0-b616-aeafd9f0faa4" path="/var/lib/kubelet/pods/96e6e687-6b83-45b0-b616-aeafd9f0faa4/volumes" Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.277404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7957c5bfd5-lfmpr"] Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954654 4764 generic.go:334] "Generic (PLEG): container finished" podID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerID="a35adaac14cf8966b239cd84c8698aa9db383b666dff5a4386d8334e2c741da8" exitCode=0 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954683 4764 generic.go:334] "Generic (PLEG): container finished" podID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerID="abd5385a4d46cdbb7f4c861a18ad57c1f6dfe1d8e546ff46a0ad00a7e2e062ba" exitCode=2 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954693 4764 generic.go:334] "Generic (PLEG): container finished" podID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerID="20007b377ef0b58fe74f1ff6da8740ea3a1952bc3d29ce710786ad6ece83d461" exitCode=0 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954702 4764 generic.go:334] "Generic (PLEG): container finished" podID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerID="4753aae83fa557447da001506741b8de664773d731ea8688f8f37831d3b3a923" exitCode=0 Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954729 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerDied","Data":"a35adaac14cf8966b239cd84c8698aa9db383b666dff5a4386d8334e2c741da8"} Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerDied","Data":"abd5385a4d46cdbb7f4c861a18ad57c1f6dfe1d8e546ff46a0ad00a7e2e062ba"} Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerDied","Data":"20007b377ef0b58fe74f1ff6da8740ea3a1952bc3d29ce710786ad6ece83d461"} Mar 20 15:12:37 crc kubenswrapper[4764]: I0320 15:12:37.954792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerDied","Data":"4753aae83fa557447da001506741b8de664773d731ea8688f8f37831d3b3a923"} Mar 20 15:12:41 crc kubenswrapper[4764]: I0320 15:12:41.481396 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.830581 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959261 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959412 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959453 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssjjd\" (UniqueName: \"kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959714 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle\") pod \"9bcaf93d-1c10-473b-9e49-641700b416d9\" (UID: \"9bcaf93d-1c10-473b-9e49-641700b416d9\") " Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.959962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.960487 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.960592 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.964071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd" (OuterVolumeSpecName: "kube-api-access-ssjjd") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "kube-api-access-ssjjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:42 crc kubenswrapper[4764]: I0320 15:12:42.973648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts" (OuterVolumeSpecName: "scripts") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.000001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.002565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" event={"ID":"884c1077-0801-4570-b8ef-195767d65d2c","Type":"ContainerStarted","Data":"79d1401b6d294f4c0212660d9cabf49a917873bf739089d1e661fff6b1e8dd0f"} Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.002607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" event={"ID":"884c1077-0801-4570-b8ef-195767d65d2c","Type":"ContainerStarted","Data":"a53e2d53f2e4bdb39d1350ec5989e100a48a9dee977fc80f92050805ae65083f"} Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.006370 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.007329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bcaf93d-1c10-473b-9e49-641700b416d9","Type":"ContainerDied","Data":"61385bb6ab46ab4050da6c639b3176cf54acc08ec3954a3d1136e72a6378e8d3"} Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.007365 4764 scope.go:117] "RemoveContainer" containerID="a35adaac14cf8966b239cd84c8698aa9db383b666dff5a4386d8334e2c741da8" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.011803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"be96e3e8-4879-473f-b5e2-34af484ddfcc","Type":"ContainerStarted","Data":"a35eaf6c93dcb068d0ded44e1536a32765331e64ada41ecae1651a7e4ecd63f5"} Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.029596 4764 scope.go:117] "RemoveContainer" containerID="abd5385a4d46cdbb7f4c861a18ad57c1f6dfe1d8e546ff46a0ad00a7e2e062ba" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.034118 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=1.470839359 podStartE2EDuration="12.034082172s" podCreationTimestamp="2026-03-20 15:12:31 +0000 UTC" firstStartedPulling="2026-03-20 15:12:32.057359611 +0000 UTC m=+1273.673548740" lastFinishedPulling="2026-03-20 15:12:42.620602424 +0000 UTC m=+1284.236791553" observedRunningTime="2026-03-20 15:12:43.029822152 +0000 UTC m=+1284.646011281" watchObservedRunningTime="2026-03-20 15:12:43.034082172 +0000 UTC m=+1284.650271311" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.051998 4764 scope.go:117] "RemoveContainer" containerID="20007b377ef0b58fe74f1ff6da8740ea3a1952bc3d29ce710786ad6ece83d461" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.062291 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.062327 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bcaf93d-1c10-473b-9e49-641700b416d9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.062340 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssjjd\" (UniqueName: \"kubernetes.io/projected/9bcaf93d-1c10-473b-9e49-641700b416d9-kube-api-access-ssjjd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.062369 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.070135 4764 scope.go:117] "RemoveContainer" containerID="4753aae83fa557447da001506741b8de664773d731ea8688f8f37831d3b3a923" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.083807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.097563 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data" (OuterVolumeSpecName: "config-data") pod "9bcaf93d-1c10-473b-9e49-641700b416d9" (UID: "9bcaf93d-1c10-473b-9e49-641700b416d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.164661 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.164697 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcaf93d-1c10-473b-9e49-641700b416d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.255968 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.256223 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-log" containerID="cri-o://b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442" gracePeriod=30 Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.256298 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-httpd" containerID="cri-o://07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de" gracePeriod=30 Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.328646 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.337288 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346329 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: E0320 15:12:43.346657 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-notification-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346671 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-notification-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: E0320 15:12:43.346681 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="sg-core" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346687 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="sg-core" Mar 20 15:12:43 crc kubenswrapper[4764]: E0320 15:12:43.346706 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="proxy-httpd" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346711 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="proxy-httpd" Mar 20 15:12:43 crc kubenswrapper[4764]: E0320 15:12:43.346723 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" 
containerName="ceilometer-central-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346736 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-central-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346940 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-notification-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346956 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="proxy-httpd" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346973 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="sg-core" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.346990 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" containerName="ceilometer-central-agent" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.348356 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.350327 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.350621 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.366004 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470366 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " 
pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470752 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6px\" (UniqueName: \"kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.470852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.471027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.525038 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:43 crc kubenswrapper[4764]: E0320 15:12:43.526314 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-4s6px log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="5d7e4981-eaae-471f-805b-a50e2b37d786" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.572953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " 
pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6px\" (UniqueName: 
\"kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.573935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.579108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.579120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.580089 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.581522 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:43 crc kubenswrapper[4764]: I0320 15:12:43.589800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6px\" (UniqueName: \"kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px\") pod \"ceilometer-0\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " pod="openstack/ceilometer-0" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.021635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" event={"ID":"884c1077-0801-4570-b8ef-195767d65d2c","Type":"ContainerStarted","Data":"f9f92b22967fd434f6131c0342fa651eb65bf9f137c7dfa985aa1e87811839c8"} Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.022015 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.022029 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.024981 4764 generic.go:334] "Generic (PLEG): container finished" podID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerID="b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442" exitCode=143 Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.025052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerDied","Data":"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442"} Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.025060 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.039554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.057173 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.057430 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-log" containerID="cri-o://4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b" gracePeriod=30 Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.057510 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-httpd" containerID="cri-o://111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b" gracePeriod=30 Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.062669 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" podStartSLOduration=8.062654245 podStartE2EDuration="8.062654245s" podCreationTimestamp="2026-03-20 15:12:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:44.058769816 +0000 UTC m=+1285.674958945" watchObservedRunningTime="2026-03-20 15:12:44.062654245 +0000 UTC m=+1285.678843374" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: 
\"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184237 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.184457 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s6px\" (UniqueName: 
\"kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px\") pod \"5d7e4981-eaae-471f-805b-a50e2b37d786\" (UID: \"5d7e4981-eaae-471f-805b-a50e2b37d786\") " Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.186086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.189225 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.203669 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts" (OuterVolumeSpecName: "scripts") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.204518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px" (OuterVolumeSpecName: "kube-api-access-4s6px") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "kube-api-access-4s6px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.204526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.204571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data" (OuterVolumeSpecName: "config-data") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.211119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d7e4981-eaae-471f-805b-a50e2b37d786" (UID: "5d7e4981-eaae-471f-805b-a50e2b37d786"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286315 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286345 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286355 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s6px\" (UniqueName: \"kubernetes.io/projected/5d7e4981-eaae-471f-805b-a50e2b37d786-kube-api-access-4s6px\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286364 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286373 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286452 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d7e4981-eaae-471f-805b-a50e2b37d786-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.286460 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d7e4981-eaae-471f-805b-a50e2b37d786-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.752777 4764 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/horizon-66899c9d8-zh5gp" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Mar 20 15:12:44 crc kubenswrapper[4764]: I0320 15:12:44.753225 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.035707 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerID="4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b" exitCode=143 Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.036689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerDied","Data":"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"} Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.036754 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.110209 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.140664 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bcaf93d-1c10-473b-9e49-641700b416d9" path="/var/lib/kubelet/pods/9bcaf93d-1c10-473b-9e49-641700b416d9/volumes" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.142024 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.142058 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.144590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.146939 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.146940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.147791 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.306119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.306493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7t8z\" (UniqueName: 
\"kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.306647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.306815 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.307296 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.307782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.307884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts\") pod \"ceilometer-0\" (UID: 
\"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409456 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7t8z\" (UniqueName: \"kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.409632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.410412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.410626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.414105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.415644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.418182 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.429834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.430114 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7t8z\" (UniqueName: \"kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z\") pod \"ceilometer-0\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.470570 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:45 crc kubenswrapper[4764]: I0320 15:12:45.825646 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:46 crc kubenswrapper[4764]: I0320 15:12:46.024035 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:46 crc kubenswrapper[4764]: I0320 15:12:46.051105 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerStarted","Data":"0637e229b86530d6f27f8246e9157bdd28fbfe75b3ba8b122a913131d8540601"} Mar 20 15:12:46 crc kubenswrapper[4764]: I0320 15:12:46.946823 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.041871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.041933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.041969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.042015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs242\" (UniqueName: \"kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.042045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.042098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.042150 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.042521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle\") pod \"f03615c4-2b7b-4db8-8706-c47f8399c808\" (UID: \"f03615c4-2b7b-4db8-8706-c47f8399c808\") " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.043011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.043500 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.044755 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs" (OuterVolumeSpecName: "logs") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.050218 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts" (OuterVolumeSpecName: "scripts") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.059630 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.077586 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242" (OuterVolumeSpecName: "kube-api-access-rs242") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "kube-api-access-rs242". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.079275 4764 generic.go:334] "Generic (PLEG): container finished" podID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerID="07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de" exitCode=0 Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.079359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerDied","Data":"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de"} Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.079467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f03615c4-2b7b-4db8-8706-c47f8399c808","Type":"ContainerDied","Data":"0642b85b1a3de9d092fcfbfb02f0c0a1045297773c800c161ae5a49cbeb09525"} Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.079500 4764 scope.go:117] "RemoveContainer" containerID="07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.079656 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.084768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerStarted","Data":"aff10edcbcd528d8f56084515f232557c5b6a4baee39abbeb159f08144448006"} Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.100104 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data" (OuterVolumeSpecName: "config-data") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.112500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.113769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f03615c4-2b7b-4db8-8706-c47f8399c808" (UID: "f03615c4-2b7b-4db8-8706-c47f8399c808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.121420 4764 scope.go:117] "RemoveContainer" containerID="b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.135850 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7e4981-eaae-471f-805b-a50e2b37d786" path="/var/lib/kubelet/pods/5d7e4981-eaae-471f-805b-a50e2b37d786/volumes" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.142635 4764 scope.go:117] "RemoveContainer" containerID="07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144747 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144854 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144870 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144881 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs242\" (UniqueName: \"kubernetes.io/projected/f03615c4-2b7b-4db8-8706-c47f8399c808-kube-api-access-rs242\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144891 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144898 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f03615c4-2b7b-4db8-8706-c47f8399c808-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.144906 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f03615c4-2b7b-4db8-8706-c47f8399c808-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: E0320 15:12:47.148871 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de\": container with ID starting with 07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de not found: ID does not exist" containerID="07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.149145 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de"} err="failed to get container status \"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de\": rpc error: code = NotFound desc = could not find container \"07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de\": container with ID starting with 07e208283c934c9aa53e187bf6cb012aeb5475ce970b9ce61586c694ae97b5de not found: ID does not exist" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.149191 4764 scope.go:117] "RemoveContainer" containerID="b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442" Mar 20 15:12:47 crc kubenswrapper[4764]: E0320 15:12:47.149979 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442\": container with ID starting with b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442 not found: ID does not exist" containerID="b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.150017 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442"} err="failed to get container status \"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442\": rpc error: code = NotFound desc = could not find container \"b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442\": container with ID starting with b88671d7051154a3f53d4f50b04b361500f482776bb6ede5f305d2e9695ee442 not found: ID does not exist" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.164599 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.247001 4764 
reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.402589 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.423844 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.444315 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 15:12:47 crc kubenswrapper[4764]: E0320 15:12:47.444670 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-httpd" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.444686 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-httpd" Mar 20 15:12:47 crc kubenswrapper[4764]: E0320 15:12:47.444712 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-log" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.444719 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-log" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.444891 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-httpd" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.444913 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" containerName="glance-log" Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.445753 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.450672 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.450923 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.453794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552237 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-logs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp547\" (UniqueName: \"kubernetes.io/projected/0def9a2d-c1f7-49a4-ae02-f32e54035e05-kube-api-access-xp547\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.552580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp547\" (UniqueName: \"kubernetes.io/projected/0def9a2d-c1f7-49a4-ae02-f32e54035e05-kube-api-access-xp547\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-logs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.654930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-logs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.655687 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.656860 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0def9a2d-c1f7-49a4-ae02-f32e54035e05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.678034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.678118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-scripts\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.680937 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.693058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp547\" (UniqueName: \"kubernetes.io/projected/0def9a2d-c1f7-49a4-ae02-f32e54035e05-kube-api-access-xp547\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.700465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0def9a2d-c1f7-49a4-ae02-f32e54035e05-config-data\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.712720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0def9a2d-c1f7-49a4-ae02-f32e54035e05\") " pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.776522 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.851904 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960041 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960443 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960571 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960632 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.960651 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcszk\" (UniqueName: \"kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk\") pod \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\" (UID: \"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68\") "
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.962044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs" (OuterVolumeSpecName: "logs") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.962371 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.973462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.974445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk" (OuterVolumeSpecName: "kube-api-access-mcszk") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "kube-api-access-mcszk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:12:47 crc kubenswrapper[4764]: I0320 15:12:47.976477 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts" (OuterVolumeSpecName: "scripts") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.062814 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.062839 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-logs\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.062848 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.062857 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcszk\" (UniqueName: \"kubernetes.io/projected/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-kube-api-access-mcszk\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.062880 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.100953 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerStarted","Data":"895db51484a923ee809288e162e0be2eac93c91c213eaa159ab1eb878571f904"}
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.115678 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.119285 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerID="111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b" exitCode=0
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.119337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerDied","Data":"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"}
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.119366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5d39b3c2-aff3-4c9f-8e01-737c7ad92d68","Type":"ContainerDied","Data":"82b07b19fa509096697ac2c0f8c8ab55c5820307bec6afaa4469e49f87f293a9"}
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.119395 4764 scope.go:117] "RemoveContainer" containerID="111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.119517 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.125792 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.140396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.145395 4764 scope.go:117] "RemoveContainer" containerID="4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.165005 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.165315 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.165373 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.166802 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data" (OuterVolumeSpecName: "config-data") pod "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" (UID: "5d39b3c2-aff3-4c9f-8e01-737c7ad92d68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.183655 4764 scope.go:117] "RemoveContainer" containerID="111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"
Mar 20 15:12:48 crc kubenswrapper[4764]: E0320 15:12:48.184513 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b\": container with ID starting with 111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b not found: ID does not exist" containerID="111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.184569 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b"} err="failed to get container status \"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b\": rpc error: code = NotFound desc = could not find container \"111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b\": container with ID starting with 111e399bea667a40b8b27a14411b600d23e040fd993dbd9a67189f69bfc4323b not found: ID does not exist"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.184609 4764 scope.go:117] "RemoveContainer" containerID="4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"
Mar 20 15:12:48 crc kubenswrapper[4764]: E0320 15:12:48.184961 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b\": container with ID starting with 4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b not found: ID does not exist" containerID="4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.184981 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b"} err="failed to get container status \"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b\": rpc error: code = NotFound desc = could not find container \"4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b\": container with ID starting with 4f7b569224b0dccb0151978533d146576c7d0a46a43aaf8bee443e90f84ca26b not found: ID does not exist"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.266929 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:12:48 crc kubenswrapper[4764]: W0320 15:12:48.360024 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0def9a2d_c1f7_49a4_ae02_f32e54035e05.slice/crio-84e6cf68f7a6187fb9685b851358eb0ab82e6009386d893eb76610facc36b84c WatchSource:0}: Error finding container 84e6cf68f7a6187fb9685b851358eb0ab82e6009386d893eb76610facc36b84c: Status 404 returned error can't find the container with id 84e6cf68f7a6187fb9685b851358eb0ab82e6009386d893eb76610facc36b84c
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.361155 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.473072 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.483182 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.521870 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:12:48 crc kubenswrapper[4764]: E0320 15:12:48.522208 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-log"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.522223 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-log"
Mar 20 15:12:48 crc kubenswrapper[4764]: E0320 15:12:48.522249 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-httpd"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.522256 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-httpd"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.522429 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-httpd"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.522454 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" containerName="glance-log"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.523283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.525342 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.525637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.610683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677796 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbd5\" (UniqueName: \"kubernetes.io/projected/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-kube-api-access-vfbd5\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.677818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779564 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbd5\" (UniqueName: \"kubernetes.io/projected/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-kube-api-access-vfbd5\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779710 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.779832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.780435 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.780951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.781184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-logs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.783950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.790553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.798607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.800825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbd5\" (UniqueName: \"kubernetes.io/projected/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-kube-api-access-vfbd5\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.804151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6abb1d10-04cf-4dc4-81e3-f5db6d8d545f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.821733 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f\") " pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:48 crc kubenswrapper[4764]: I0320 15:12:48.864307 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.150300 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d39b3c2-aff3-4c9f-8e01-737c7ad92d68" path="/var/lib/kubelet/pods/5d39b3c2-aff3-4c9f-8e01-737c7ad92d68/volumes"
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.158736 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03615c4-2b7b-4db8-8706-c47f8399c808" path="/var/lib/kubelet/pods/f03615c4-2b7b-4db8-8706-c47f8399c808/volumes"
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.159508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0def9a2d-c1f7-49a4-ae02-f32e54035e05","Type":"ContainerStarted","Data":"12dc87ed24b145d2cf54c1bbb47d6303493a31e1c3c009fe3e18bb1949182e23"}
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.159530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0def9a2d-c1f7-49a4-ae02-f32e54035e05","Type":"ContainerStarted","Data":"84e6cf68f7a6187fb9685b851358eb0ab82e6009386d893eb76610facc36b84c"}
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.177999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerStarted","Data":"75251a47d65df28c8e4a7078e94eda9da9607376af2e76b18595cdad00e6cbf9"}
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.262403 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56ccbd6c69-4c89q"
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.395696 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"]
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.396016 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b6dc455f6-dhtvx" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-api" containerID="cri-o://87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283" gracePeriod=30
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.396590 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b6dc455f6-dhtvx" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-httpd" containerID="cri-o://934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a" gracePeriod=30
Mar 20 15:12:49 crc kubenswrapper[4764]: I0320 15:12:49.539740 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 15:12:49 crc kubenswrapper[4764]: E0320 15:12:49.812243 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a8f151d_8a23_42d9_90a1_65caade3b03e.slice/crio-conmon-934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.131285 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.199449 4764 generic.go:334] "Generic (PLEG): container finished" podID="e532b989-f73c-49a1-b4f2-43322246a71e" containerID="8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41" exitCode=137 Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.199525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerDied","Data":"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41"} Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.199563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66899c9d8-zh5gp" event={"ID":"e532b989-f73c-49a1-b4f2-43322246a71e","Type":"ContainerDied","Data":"a05d6c3300d06509b07f48e78be82b6babdff234c8e065151d6eeeed4f235ad7"} Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.199584 4764 scope.go:117] "RemoveContainer" containerID="c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.199750 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66899c9d8-zh5gp" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.221085 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerID="934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a" exitCode=0 Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.221237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerDied","Data":"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a"} Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.256746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0def9a2d-c1f7-49a4-ae02-f32e54035e05","Type":"ContainerStarted","Data":"f5eb91d15bd4fc7895709e19c510f07c322cd86d5bbcd6c1d286a0465524676f"} Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.264146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f","Type":"ContainerStarted","Data":"97a07d5b531c0b2bf477fcd1508a7c0857ac61e59523d5cc1e6cfc463a31a94d"} Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.297571 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.29754828 podStartE2EDuration="3.29754828s" podCreationTimestamp="2026-03-20 15:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:50.282052867 +0000 UTC m=+1291.898242006" watchObservedRunningTime="2026-03-20 15:12:50.29754828 +0000 UTC m=+1291.913737409" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.326997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.327202 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfk59\" (UniqueName: \"kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.327230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.327297 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.327318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.327334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 
15:12:50.327360 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs\") pod \"e532b989-f73c-49a1-b4f2-43322246a71e\" (UID: \"e532b989-f73c-49a1-b4f2-43322246a71e\") " Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.330147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs" (OuterVolumeSpecName: "logs") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.334596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.336046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59" (OuterVolumeSpecName: "kube-api-access-bfk59") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "kube-api-access-bfk59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.360496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.367119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data" (OuterVolumeSpecName: "config-data") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.368933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts" (OuterVolumeSpecName: "scripts") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.390448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e532b989-f73c-49a1-b4f2-43322246a71e" (UID: "e532b989-f73c-49a1-b4f2-43322246a71e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.425719 4764 scope.go:117] "RemoveContainer" containerID="8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429730 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfk59\" (UniqueName: \"kubernetes.io/projected/e532b989-f73c-49a1-b4f2-43322246a71e-kube-api-access-bfk59\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429754 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429764 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429774 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429786 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e532b989-f73c-49a1-b4f2-43322246a71e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429799 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e532b989-f73c-49a1-b4f2-43322246a71e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.429810 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e532b989-f73c-49a1-b4f2-43322246a71e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.446498 4764 scope.go:117] "RemoveContainer" containerID="c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12" Mar 20 15:12:50 crc kubenswrapper[4764]: E0320 15:12:50.447849 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12\": container with ID starting with c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12 not found: ID does not exist" containerID="c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.447879 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12"} err="failed to get container status \"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12\": rpc error: code = NotFound desc = could not find container \"c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12\": container with ID starting with c9464dcbef069af0f337f304147385daa7baa0932b9d3bb31e18c304808b8e12 not found: ID does not exist" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.447900 4764 scope.go:117] "RemoveContainer" containerID="8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41" Mar 20 15:12:50 crc kubenswrapper[4764]: E0320 15:12:50.448131 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41\": container with ID starting with 8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41 not found: ID does not exist" containerID="8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41" Mar 20 
15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.448155 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41"} err="failed to get container status \"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41\": rpc error: code = NotFound desc = could not find container \"8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41\": container with ID starting with 8fb2c3bd4b0784d3f68efae0b02cedb0463b2263346860d75d741f7b0ec40f41 not found: ID does not exist" Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.618325 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:12:50 crc kubenswrapper[4764]: I0320 15:12:50.635037 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66899c9d8-zh5gp"] Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.134937 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" path="/var/lib/kubelet/pods/e532b989-f73c-49a1-b4f2-43322246a71e/volumes" Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.273932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f","Type":"ContainerStarted","Data":"313d986dfc0350ea7ff3437fed25b33c486b3815a87b6e221729db6c871274f0"} Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.273975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6abb1d10-04cf-4dc4-81e3-f5db6d8d545f","Type":"ContainerStarted","Data":"a51350fbf989f305fe9973430aecc44dec40c05ec40bfb5f9904de28cfa4a2ed"} Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerStarted","Data":"28f4b2b7d7a35ff08d9ada6f093e007ce43f6c4619311aca63b809dbe2e5e784"} Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276449 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-central-agent" containerID="cri-o://aff10edcbcd528d8f56084515f232557c5b6a4baee39abbeb159f08144448006" gracePeriod=30 Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276635 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276677 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="proxy-httpd" containerID="cri-o://28f4b2b7d7a35ff08d9ada6f093e007ce43f6c4619311aca63b809dbe2e5e784" gracePeriod=30 Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276736 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="sg-core" containerID="cri-o://75251a47d65df28c8e4a7078e94eda9da9607376af2e76b18595cdad00e6cbf9" gracePeriod=30 Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.276769 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-notification-agent" containerID="cri-o://895db51484a923ee809288e162e0be2eac93c91c213eaa159ab1eb878571f904" gracePeriod=30 Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.303508 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.303490302 podStartE2EDuration="3.303490302s" podCreationTimestamp="2026-03-20 
15:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:12:51.294225779 +0000 UTC m=+1292.910414908" watchObservedRunningTime="2026-03-20 15:12:51.303490302 +0000 UTC m=+1292.919679431" Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.315253 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.380670387 podStartE2EDuration="6.31522287s" podCreationTimestamp="2026-03-20 15:12:45 +0000 UTC" firstStartedPulling="2026-03-20 15:12:46.032732011 +0000 UTC m=+1287.648921140" lastFinishedPulling="2026-03-20 15:12:49.967284494 +0000 UTC m=+1291.583473623" observedRunningTime="2026-03-20 15:12:51.313862919 +0000 UTC m=+1292.930052048" watchObservedRunningTime="2026-03-20 15:12:51.31522287 +0000 UTC m=+1292.931411999" Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.734437 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:51 crc kubenswrapper[4764]: I0320 15:12:51.751019 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7957c5bfd5-lfmpr" Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.290952 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerID="28f4b2b7d7a35ff08d9ada6f093e007ce43f6c4619311aca63b809dbe2e5e784" exitCode=0 Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.290983 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerID="75251a47d65df28c8e4a7078e94eda9da9607376af2e76b18595cdad00e6cbf9" exitCode=2 Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.290991 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" 
containerID="895db51484a923ee809288e162e0be2eac93c91c213eaa159ab1eb878571f904" exitCode=0 Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.291019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerDied","Data":"28f4b2b7d7a35ff08d9ada6f093e007ce43f6c4619311aca63b809dbe2e5e784"} Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.291074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerDied","Data":"75251a47d65df28c8e4a7078e94eda9da9607376af2e76b18595cdad00e6cbf9"} Mar 20 15:12:52 crc kubenswrapper[4764]: I0320 15:12:52.291088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerDied","Data":"895db51484a923ee809288e162e0be2eac93c91c213eaa159ab1eb878571f904"} Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.222246 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.318766 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerID="87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283" exitCode=0 Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.318941 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b6dc455f6-dhtvx" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.319007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerDied","Data":"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283"} Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.319870 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6dc455f6-dhtvx" event={"ID":"0a8f151d-8a23-42d9-90a1-65caade3b03e","Type":"ContainerDied","Data":"fe7bb9439d51b2652644d0e4848439d8aff1dcf583a55a74a6421964363d7416"} Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.319892 4764 scope.go:117] "RemoveContainer" containerID="934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.321993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config\") pod \"0a8f151d-8a23-42d9-90a1-65caade3b03e\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.322071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config\") pod \"0a8f151d-8a23-42d9-90a1-65caade3b03e\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.322146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54d59\" (UniqueName: \"kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59\") pod \"0a8f151d-8a23-42d9-90a1-65caade3b03e\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.322232 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs\") pod \"0a8f151d-8a23-42d9-90a1-65caade3b03e\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.322264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle\") pod \"0a8f151d-8a23-42d9-90a1-65caade3b03e\" (UID: \"0a8f151d-8a23-42d9-90a1-65caade3b03e\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.329270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0a8f151d-8a23-42d9-90a1-65caade3b03e" (UID: "0a8f151d-8a23-42d9-90a1-65caade3b03e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.329853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59" (OuterVolumeSpecName: "kube-api-access-54d59") pod "0a8f151d-8a23-42d9-90a1-65caade3b03e" (UID: "0a8f151d-8a23-42d9-90a1-65caade3b03e"). InnerVolumeSpecName "kube-api-access-54d59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.332919 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerID="aff10edcbcd528d8f56084515f232557c5b6a4baee39abbeb159f08144448006" exitCode=0 Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.332959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerDied","Data":"aff10edcbcd528d8f56084515f232557c5b6a4baee39abbeb159f08144448006"} Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.393198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a8f151d-8a23-42d9-90a1-65caade3b03e" (UID: "0a8f151d-8a23-42d9-90a1-65caade3b03e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.397003 4764 scope.go:117] "RemoveContainer" containerID="87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.410422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0a8f151d-8a23-42d9-90a1-65caade3b03e" (UID: "0a8f151d-8a23-42d9-90a1-65caade3b03e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.412693 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config" (OuterVolumeSpecName: "config") pod "0a8f151d-8a23-42d9-90a1-65caade3b03e" (UID: "0a8f151d-8a23-42d9-90a1-65caade3b03e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.420306 4764 scope.go:117] "RemoveContainer" containerID="934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a" Mar 20 15:12:55 crc kubenswrapper[4764]: E0320 15:12:55.420661 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a\": container with ID starting with 934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a not found: ID does not exist" containerID="934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.420691 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a"} err="failed to get container status \"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a\": rpc error: code = NotFound desc = could not find container \"934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a\": container with ID starting with 934a86adc28645c1a81ff48a0bba0d85ec21d8285af79c8106cb8237cae4173a not found: ID does not exist" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.420711 4764 scope.go:117] "RemoveContainer" containerID="87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283" Mar 20 15:12:55 crc kubenswrapper[4764]: E0320 15:12:55.422009 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283\": container with ID starting with 87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283 not found: ID does not exist" containerID="87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.422035 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283"} err="failed to get container status \"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283\": rpc error: code = NotFound desc = could not find container \"87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283\": container with ID starting with 87af05f91fa3630cd998fdfc49214c742baa14b04ad52e9db96e2d6733e21283 not found: ID does not exist" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.424697 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.424729 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.424741 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54d59\" (UniqueName: \"kubernetes.io/projected/0a8f151d-8a23-42d9-90a1-65caade3b03e-kube-api-access-54d59\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.424752 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" 
Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.424762 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8f151d-8a23-42d9-90a1-65caade3b03e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.433283 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.525849 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.525905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.525995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k7t8z\" (UniqueName: \"kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526197 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526244 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd\") pod \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\" (UID: \"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4\") " Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526371 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.526797 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.527012 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.527027 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.546007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts" (OuterVolumeSpecName: "scripts") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.552519 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z" (OuterVolumeSpecName: "kube-api-access-k7t8z") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "kube-api-access-k7t8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.563665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.628905 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7t8z\" (UniqueName: \"kubernetes.io/projected/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-kube-api-access-k7t8z\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.628936 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.628947 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.706744 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.729742 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data" (OuterVolumeSpecName: "config-data") pod "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" (UID: "27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.730763 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.730780 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.787672 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"] Mar 20 15:12:55 crc kubenswrapper[4764]: I0320 15:12:55.794135 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b6dc455f6-dhtvx"] Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.341435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4","Type":"ContainerDied","Data":"0637e229b86530d6f27f8246e9157bdd28fbfe75b3ba8b122a913131d8540601"} Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.341470 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.341492 4764 scope.go:117] "RemoveContainer" containerID="28f4b2b7d7a35ff08d9ada6f093e007ce43f6c4619311aca63b809dbe2e5e784" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.365150 4764 scope.go:117] "RemoveContainer" containerID="75251a47d65df28c8e4a7078e94eda9da9607376af2e76b18595cdad00e6cbf9" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.376697 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.382944 4764 scope.go:117] "RemoveContainer" containerID="895db51484a923ee809288e162e0be2eac93c91c213eaa159ab1eb878571f904" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.390082 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.403795 4764 scope.go:117] "RemoveContainer" containerID="aff10edcbcd528d8f56084515f232557c5b6a4baee39abbeb159f08144448006" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.409825 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410240 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-api" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410261 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-api" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410270 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-notification-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410278 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" 
containerName="ceilometer-notification-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410291 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon-log" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410296 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon-log" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-central-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410314 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-central-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410322 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="sg-core" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410328 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="sg-core" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410343 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="proxy-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410348 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="proxy-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410363 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410368 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" 
containerName="neutron-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: E0320 15:12:56.410381 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410387 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410559 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410571 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-api" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410581 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" containerName="neutron-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410595 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="sg-core" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410606 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="proxy-httpd" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" containerName="ceilometer-notification-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410628 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e532b989-f73c-49a1-b4f2-43322246a71e" containerName="horizon-log" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.410637 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" 
containerName="ceilometer-central-agent" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.412122 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.416046 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.416334 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.467069 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.542970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7f6z\" (UniqueName: \"kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543077 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543166 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543216 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.543256 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " 
pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645124 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7f6z\" (UniqueName: \"kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.645642 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.646001 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.649231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.649548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.651811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.652385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.665682 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w7f6z\" (UniqueName: \"kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z\") pod \"ceilometer-0\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " pod="openstack/ceilometer-0" Mar 20 15:12:56 crc kubenswrapper[4764]: I0320 15:12:56.734224 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.136204 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8f151d-8a23-42d9-90a1-65caade3b03e" path="/var/lib/kubelet/pods/0a8f151d-8a23-42d9-90a1-65caade3b03e/volumes" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.137197 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4" path="/var/lib/kubelet/pods/27f3f77d-0f48-4f9a-8055-8d6f7bef0ca4/volumes" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.205140 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.350828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerStarted","Data":"109b71cddf38c7eb36bffddde9010ff840cda217e4b74232ad0a21a3ccbe33bf"} Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.778188 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.778246 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.820489 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:12:57 crc kubenswrapper[4764]: I0320 15:12:57.825877 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.364631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerStarted","Data":"0d90fa56c1d74b346d6f50a86b5c83b062a0d2592e856e3737dc0f2b1211143d"} Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.365601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.365947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.389705 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.865638 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.866222 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.932992 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.994837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:58 crc kubenswrapper[4764]: I0320 15:12:58.999933 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64c4774c48-rhcrn" Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.005441 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.104954 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.105487 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6bdffd5796-4724r" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-log" containerID="cri-o://c0a5f6cb0134ea2c57541759e37b1f4118d99daab52ac20c28ed4d50a48e572e" gracePeriod=30 Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.105692 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6bdffd5796-4724r" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-api" containerID="cri-o://1d19694c192b43ec439c344a64633b37c479312729679e189d629323589abd8a" gracePeriod=30 Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.407830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerStarted","Data":"6a7e975d5d96646270ac44cf2b88066ed9a4cd90473ee03382abf176e1f48f80"} Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.424541 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ac36915-9830-4e2c-871c-d8b56e780587" containerID="c0a5f6cb0134ea2c57541759e37b1f4118d99daab52ac20c28ed4d50a48e572e" exitCode=143 Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.424677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerDied","Data":"c0a5f6cb0134ea2c57541759e37b1f4118d99daab52ac20c28ed4d50a48e572e"} Mar 20 15:12:59 crc kubenswrapper[4764]: I0320 15:12:59.426436 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:12:59 crc 
kubenswrapper[4764]: I0320 15:12:59.427353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.433825 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.434306 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.434191 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerStarted","Data":"f4c52e5ec68f4e622d08b39821abf77e68e81d8d3cf0b678b3429613f704d1fa"} Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.763945 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-g8f4s"] Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.764991 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.773418 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.780986 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.781606 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-g8f4s"] Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.839505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgd55\" (UniqueName: \"kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.839553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.881928 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nwrmx"] Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.883444 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.901022 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nwrmx"] Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.941031 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgd55\" (UniqueName: \"kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.941084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.942050 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.976045 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgd55\" (UniqueName: \"kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55\") pod \"nova-api-db-create-g8f4s\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.996185 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ffae-account-create-update-vn59x"] Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 
15:13:00.997580 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:00 crc kubenswrapper[4764]: I0320 15:13:00.999655 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.013130 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ffae-account-create-update-vn59x"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.042239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.042300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r89b\" (UniqueName: \"kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.069517 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9jbjr"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.070762 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.077075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9jbjr"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.083070 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.147789 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.148686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.148876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv56s\" (UniqueName: \"kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.148994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r89b\" (UniqueName: \"kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.149135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmx4l\" (UniqueName: \"kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l\") pod 
\"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.149295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.149361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts\") pod \"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.170772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r89b\" (UniqueName: \"kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b\") pod \"nova-cell0-db-create-nwrmx\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.189384 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-38d4-account-create-update-4zn9m"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.191792 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.203767 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.213061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-38d4-account-create-update-4zn9m"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.213448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.253005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv56s\" (UniqueName: \"kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.253338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmx4l\" (UniqueName: \"kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l\") pod \"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.253445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.253462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts\") pod \"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.254135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts\") pod \"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.258893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.274650 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv56s\" (UniqueName: \"kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s\") pod \"nova-api-ffae-account-create-update-vn59x\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.277434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmx4l\" (UniqueName: \"kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l\") pod \"nova-cell1-db-create-9jbjr\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.323210 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.355267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sml2\" (UniqueName: \"kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.355430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.377686 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cc83-account-create-update-rcg5p"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.379871 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.382076 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.387757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cc83-account-create-update-rcg5p"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.388420 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.456504 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.456527 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.458293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sml2\" (UniqueName: \"kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.458902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.458974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.459009 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6cp\" (UniqueName: \"kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: 
\"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.459905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.484030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sml2\" (UniqueName: \"kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2\") pod \"nova-cell0-38d4-account-create-update-4zn9m\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.560368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.560466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6cp\" (UniqueName: \"kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.562267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.568938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.583313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6cp\" (UniqueName: \"kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp\") pod \"nova-cell1-cc83-account-create-update-rcg5p\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.594308 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-g8f4s"] Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.707562 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.729143 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.848942 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 15:13:01 crc kubenswrapper[4764]: I0320 15:13:01.850539 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nwrmx"] Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.061467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ffae-account-create-update-vn59x"] Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.191337 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cc83-account-create-update-rcg5p"] Mar 20 15:13:02 crc kubenswrapper[4764]: W0320 15:13:02.235479 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode15c13f0_2f86_4aba_b109_bac20813f7c6.slice/crio-24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2 WatchSource:0}: Error finding container 24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2: Status 404 returned error can't find the container with id 24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2 Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.263609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9jbjr"] Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.295588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-38d4-account-create-update-4zn9m"] Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.476655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerStarted","Data":"2b2466b100fb8e8b55d8af9d420ca160e1b2d6c45cb7a1eb2789a7cde9ac60dc"} Mar 20 15:13:02 crc 
kubenswrapper[4764]: I0320 15:13:02.476813 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-central-agent" containerID="cri-o://0d90fa56c1d74b346d6f50a86b5c83b062a0d2592e856e3737dc0f2b1211143d" gracePeriod=30 Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.477073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.477314 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="proxy-httpd" containerID="cri-o://2b2466b100fb8e8b55d8af9d420ca160e1b2d6c45cb7a1eb2789a7cde9ac60dc" gracePeriod=30 Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.477354 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="sg-core" containerID="cri-o://f4c52e5ec68f4e622d08b39821abf77e68e81d8d3cf0b678b3429613f704d1fa" gracePeriod=30 Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.477410 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-notification-agent" containerID="cri-o://6a7e975d5d96646270ac44cf2b88066ed9a4cd90473ee03382abf176e1f48f80" gracePeriod=30 Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.491143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g8f4s" event={"ID":"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6","Type":"ContainerStarted","Data":"d792c22067c97259c38be96cc357702c6b4ad5618748f625c5a9e3dce4bdb43e"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.491181 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-g8f4s" event={"ID":"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6","Type":"ContainerStarted","Data":"1d5d87c8572c41b7ec676036269ee574ef6f4e46f534e8dc8d3282924cb8bbda"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.494045 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nwrmx" event={"ID":"7e806e8a-b9b4-47eb-bd27-a70a53705c32","Type":"ContainerStarted","Data":"e7412c1c9c97aa82b728a7d4953f865662928f42369a5ba8af59dd57390cbbf9"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.500035 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ffae-account-create-update-vn59x" event={"ID":"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3","Type":"ContainerStarted","Data":"229861780cfc3330f2498db827869a6c5d8b94e9396775178a50f43579deb5c0"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.505892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" event={"ID":"e15c13f0-2f86-4aba-b109-bac20813f7c6","Type":"ContainerStarted","Data":"24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.513191 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" event={"ID":"ce3ec590-49ca-4806-9a9e-5699c798e051","Type":"ContainerStarted","Data":"d3eddf54155392c8d6b6821680ccd07ca8719263d56317d08f89c1610decc21d"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.516035 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.106359842 podStartE2EDuration="6.516014814s" podCreationTimestamp="2026-03-20 15:12:56 +0000 UTC" firstStartedPulling="2026-03-20 15:12:57.20584007 +0000 UTC m=+1298.822029209" lastFinishedPulling="2026-03-20 15:13:01.615495052 +0000 UTC m=+1303.231684181" observedRunningTime="2026-03-20 15:13:02.501820631 +0000 UTC 
m=+1304.118009760" watchObservedRunningTime="2026-03-20 15:13:02.516014814 +0000 UTC m=+1304.132203943" Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.518514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9jbjr" event={"ID":"318faba6-6466-4f38-8f29-b01709e93bea","Type":"ContainerStarted","Data":"21087845c39cea04665422cad52e248e4e4eab28f1bcab17b51d4a344abdfd3a"} Mar 20 15:13:02 crc kubenswrapper[4764]: I0320 15:13:02.533560 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-g8f4s" podStartSLOduration=2.533526549 podStartE2EDuration="2.533526549s" podCreationTimestamp="2026-03-20 15:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:02.518667335 +0000 UTC m=+1304.134856464" watchObservedRunningTime="2026-03-20 15:13:02.533526549 +0000 UTC m=+1304.149715678" Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.078321 4764 scope.go:117] "RemoveContainer" containerID="7a0aa33dbad07904ee592cf9e7e4bdb22f38f75749baff3de3ef3968975691e5" Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.528873 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ac36915-9830-4e2c-871c-d8b56e780587" containerID="1d19694c192b43ec439c344a64633b37c479312729679e189d629323589abd8a" exitCode=0 Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.528933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerDied","Data":"1d19694c192b43ec439c344a64633b37c479312729679e189d629323589abd8a"} Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.533435 4764 generic.go:334] "Generic (PLEG): container finished" podID="d58ac244-171d-4fd1-bfca-12a13315defd" containerID="2b2466b100fb8e8b55d8af9d420ca160e1b2d6c45cb7a1eb2789a7cde9ac60dc" exitCode=0 Mar 20 15:13:03 
crc kubenswrapper[4764]: I0320 15:13:03.533479 4764 generic.go:334] "Generic (PLEG): container finished" podID="d58ac244-171d-4fd1-bfca-12a13315defd" containerID="f4c52e5ec68f4e622d08b39821abf77e68e81d8d3cf0b678b3429613f704d1fa" exitCode=2 Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.533496 4764 generic.go:334] "Generic (PLEG): container finished" podID="d58ac244-171d-4fd1-bfca-12a13315defd" containerID="6a7e975d5d96646270ac44cf2b88066ed9a4cd90473ee03382abf176e1f48f80" exitCode=0 Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.533499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerDied","Data":"2b2466b100fb8e8b55d8af9d420ca160e1b2d6c45cb7a1eb2789a7cde9ac60dc"} Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.533534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerDied","Data":"f4c52e5ec68f4e622d08b39821abf77e68e81d8d3cf0b678b3429613f704d1fa"} Mar 20 15:13:03 crc kubenswrapper[4764]: I0320 15:13:03.533549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerDied","Data":"6a7e975d5d96646270ac44cf2b88066ed9a4cd90473ee03382abf176e1f48f80"} Mar 20 15:13:04 crc kubenswrapper[4764]: I0320 15:13:04.544036 4764 generic.go:334] "Generic (PLEG): container finished" podID="fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" containerID="d792c22067c97259c38be96cc357702c6b4ad5618748f625c5a9e3dce4bdb43e" exitCode=0 Mar 20 15:13:04 crc kubenswrapper[4764]: I0320 15:13:04.544104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g8f4s" event={"ID":"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6","Type":"ContainerDied","Data":"d792c22067c97259c38be96cc357702c6b4ad5618748f625c5a9e3dce4bdb43e"} Mar 20 15:13:04 crc kubenswrapper[4764]: I0320 
15:13:04.546034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nwrmx" event={"ID":"7e806e8a-b9b4-47eb-bd27-a70a53705c32","Type":"ContainerStarted","Data":"2fbbb178bcbeebed7d961ec717e54209cb5b795217f7bc857740b8d8b736d0c4"} Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.557978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ffae-account-create-update-vn59x" event={"ID":"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3","Type":"ContainerStarted","Data":"348731f5dd646000d50072190415b5151b6adaaa6723bac653816c63f16a0ca0"} Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.560919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" event={"ID":"e15c13f0-2f86-4aba-b109-bac20813f7c6","Type":"ContainerStarted","Data":"531116529d796233fe33036f960abaa83b9644b04cd24fd2c622c4c194ccc7b9"} Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.606321 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-nwrmx" podStartSLOduration=5.606301342 podStartE2EDuration="5.606301342s" podCreationTimestamp="2026-03-20 15:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:04.573019706 +0000 UTC m=+1306.189208855" watchObservedRunningTime="2026-03-20 15:13:05.606301342 +0000 UTC m=+1307.222490471" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.606902 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ffae-account-create-update-vn59x" podStartSLOduration=5.6068947300000005 podStartE2EDuration="5.60689473s" podCreationTimestamp="2026-03-20 15:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:05.586947601 +0000 UTC m=+1307.203136730" 
watchObservedRunningTime="2026-03-20 15:13:05.60689473 +0000 UTC m=+1307.223083859" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.621714 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" podStartSLOduration=4.621682762 podStartE2EDuration="4.621682762s" podCreationTimestamp="2026-03-20 15:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:05.605703044 +0000 UTC m=+1307.221892163" watchObservedRunningTime="2026-03-20 15:13:05.621682762 +0000 UTC m=+1307.237871891" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.821742 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875336 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875470 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxb8\" (UniqueName: \"kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875629 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.875718 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts\") pod \"6ac36915-9830-4e2c-871c-d8b56e780587\" (UID: \"6ac36915-9830-4e2c-871c-d8b56e780587\") " Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.876965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs" (OuterVolumeSpecName: "logs") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.893952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts" (OuterVolumeSpecName: "scripts") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.894562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8" (OuterVolumeSpecName: "kube-api-access-jfxb8") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "kube-api-access-jfxb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.977711 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.977743 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac36915-9830-4e2c-871c-d8b56e780587-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:05 crc kubenswrapper[4764]: I0320 15:13:05.977752 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxb8\" (UniqueName: \"kubernetes.io/projected/6ac36915-9830-4e2c-871c-d8b56e780587-kube-api-access-jfxb8\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.003770 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.028787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.062505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.083492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts\") pod \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.083875 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgd55\" (UniqueName: \"kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55\") pod \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\" (UID: \"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6\") " Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.085084 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 
15:13:06.085108 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.086073 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" (UID: "fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.119221 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55" (OuterVolumeSpecName: "kube-api-access-cgd55") pod "fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" (UID: "fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6"). InnerVolumeSpecName "kube-api-access-cgd55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.124725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data" (OuterVolumeSpecName: "config-data") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.126813 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac36915-9830-4e2c-871c-d8b56e780587" (UID: "6ac36915-9830-4e2c-871c-d8b56e780587"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.189270 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.189309 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac36915-9830-4e2c-871c-d8b56e780587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.189322 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.189333 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgd55\" (UniqueName: \"kubernetes.io/projected/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6-kube-api-access-cgd55\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.571078 4764 generic.go:334] "Generic (PLEG): container finished" podID="7e806e8a-b9b4-47eb-bd27-a70a53705c32" containerID="2fbbb178bcbeebed7d961ec717e54209cb5b795217f7bc857740b8d8b736d0c4" exitCode=0 Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.571168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nwrmx" event={"ID":"7e806e8a-b9b4-47eb-bd27-a70a53705c32","Type":"ContainerDied","Data":"2fbbb178bcbeebed7d961ec717e54209cb5b795217f7bc857740b8d8b736d0c4"} Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.577935 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdffd5796-4724r" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.577954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdffd5796-4724r" event={"ID":"6ac36915-9830-4e2c-871c-d8b56e780587","Type":"ContainerDied","Data":"aa651dd5976b6201e5c33968256d526e6a79ab2b8dbaa07a6cc43993a4411ffd"} Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.577997 4764 scope.go:117] "RemoveContainer" containerID="1d19694c192b43ec439c344a64633b37c479312729679e189d629323589abd8a" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.580469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g8f4s" event={"ID":"fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6","Type":"ContainerDied","Data":"1d5d87c8572c41b7ec676036269ee574ef6f4e46f534e8dc8d3282924cb8bbda"} Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.580495 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5d87c8572c41b7ec676036269ee574ef6f4e46f534e8dc8d3282924cb8bbda" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.580579 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-g8f4s" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.589538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" event={"ID":"ce3ec590-49ca-4806-9a9e-5699c798e051","Type":"ContainerStarted","Data":"61a5275ea2051cc4b1f273ae4bfbd7d011e22af1ade96cdafaa4ad36abe697fa"} Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.592759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9jbjr" event={"ID":"318faba6-6466-4f38-8f29-b01709e93bea","Type":"ContainerStarted","Data":"f3229830082952e3dd130b86520b41db93101029258d5a49a565a96e4b17ad40"} Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.601598 4764 scope.go:117] "RemoveContainer" containerID="c0a5f6cb0134ea2c57541759e37b1f4118d99daab52ac20c28ed4d50a48e572e" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.620150 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" podStartSLOduration=5.620134464 podStartE2EDuration="5.620134464s" podCreationTimestamp="2026-03-20 15:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:06.616800003 +0000 UTC m=+1308.232989132" watchObservedRunningTime="2026-03-20 15:13:06.620134464 +0000 UTC m=+1308.236323593" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.647431 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-9jbjr" podStartSLOduration=5.647406188 podStartE2EDuration="5.647406188s" podCreationTimestamp="2026-03-20 15:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:06.633455372 +0000 UTC m=+1308.249644511" watchObservedRunningTime="2026-03-20 
15:13:06.647406188 +0000 UTC m=+1308.263595327" Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.662630 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:13:06 crc kubenswrapper[4764]: I0320 15:13:06.686652 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6bdffd5796-4724r"] Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.138942 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" path="/var/lib/kubelet/pods/6ac36915-9830-4e2c-871c-d8b56e780587/volumes" Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.604524 4764 generic.go:334] "Generic (PLEG): container finished" podID="e15c13f0-2f86-4aba-b109-bac20813f7c6" containerID="531116529d796233fe33036f960abaa83b9644b04cd24fd2c622c4c194ccc7b9" exitCode=0 Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.604634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" event={"ID":"e15c13f0-2f86-4aba-b109-bac20813f7c6","Type":"ContainerDied","Data":"531116529d796233fe33036f960abaa83b9644b04cd24fd2c622c4c194ccc7b9"} Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.606631 4764 generic.go:334] "Generic (PLEG): container finished" podID="ce3ec590-49ca-4806-9a9e-5699c798e051" containerID="61a5275ea2051cc4b1f273ae4bfbd7d011e22af1ade96cdafaa4ad36abe697fa" exitCode=0 Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.606671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" event={"ID":"ce3ec590-49ca-4806-9a9e-5699c798e051","Type":"ContainerDied","Data":"61a5275ea2051cc4b1f273ae4bfbd7d011e22af1ade96cdafaa4ad36abe697fa"} Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.608340 4764 generic.go:334] "Generic (PLEG): container finished" podID="318faba6-6466-4f38-8f29-b01709e93bea" 
containerID="f3229830082952e3dd130b86520b41db93101029258d5a49a565a96e4b17ad40" exitCode=0 Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.608417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9jbjr" event={"ID":"318faba6-6466-4f38-8f29-b01709e93bea","Type":"ContainerDied","Data":"f3229830082952e3dd130b86520b41db93101029258d5a49a565a96e4b17ad40"} Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.611084 4764 generic.go:334] "Generic (PLEG): container finished" podID="cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" containerID="348731f5dd646000d50072190415b5151b6adaaa6723bac653816c63f16a0ca0" exitCode=0 Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.611140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ffae-account-create-update-vn59x" event={"ID":"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3","Type":"ContainerDied","Data":"348731f5dd646000d50072190415b5151b6adaaa6723bac653816c63f16a0ca0"} Mar 20 15:13:07 crc kubenswrapper[4764]: I0320 15:13:07.975501 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.020984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts\") pod \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.021055 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r89b\" (UniqueName: \"kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b\") pod \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\" (UID: \"7e806e8a-b9b4-47eb-bd27-a70a53705c32\") " Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.021944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e806e8a-b9b4-47eb-bd27-a70a53705c32" (UID: "7e806e8a-b9b4-47eb-bd27-a70a53705c32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.028661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b" (OuterVolumeSpecName: "kube-api-access-2r89b") pod "7e806e8a-b9b4-47eb-bd27-a70a53705c32" (UID: "7e806e8a-b9b4-47eb-bd27-a70a53705c32"). InnerVolumeSpecName "kube-api-access-2r89b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.123416 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e806e8a-b9b4-47eb-bd27-a70a53705c32-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.123452 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r89b\" (UniqueName: \"kubernetes.io/projected/7e806e8a-b9b4-47eb-bd27-a70a53705c32-kube-api-access-2r89b\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.627823 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nwrmx" event={"ID":"7e806e8a-b9b4-47eb-bd27-a70a53705c32","Type":"ContainerDied","Data":"e7412c1c9c97aa82b728a7d4953f865662928f42369a5ba8af59dd57390cbbf9"} Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.627881 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7412c1c9c97aa82b728a7d4953f865662928f42369a5ba8af59dd57390cbbf9" Mar 20 15:13:08 crc kubenswrapper[4764]: I0320 15:13:08.628241 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nwrmx" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.067072 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.163201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts\") pod \"318faba6-6466-4f38-8f29-b01709e93bea\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.163536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmx4l\" (UniqueName: \"kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l\") pod \"318faba6-6466-4f38-8f29-b01709e93bea\" (UID: \"318faba6-6466-4f38-8f29-b01709e93bea\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.164062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "318faba6-6466-4f38-8f29-b01709e93bea" (UID: "318faba6-6466-4f38-8f29-b01709e93bea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.170625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l" (OuterVolumeSpecName: "kube-api-access-hmx4l") pod "318faba6-6466-4f38-8f29-b01709e93bea" (UID: "318faba6-6466-4f38-8f29-b01709e93bea"). InnerVolumeSpecName "kube-api-access-hmx4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.208743 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.216044 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.226399 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts\") pod \"e15c13f0-2f86-4aba-b109-bac20813f7c6\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264807 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts\") pod \"ce3ec590-49ca-4806-9a9e-5699c798e051\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts\") pod \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sml2\" (UniqueName: \"kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2\") pod \"ce3ec590-49ca-4806-9a9e-5699c798e051\" (UID: \"ce3ec590-49ca-4806-9a9e-5699c798e051\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264937 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv56s\" (UniqueName: \"kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s\") pod \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\" (UID: \"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.264952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6cp\" (UniqueName: \"kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp\") pod \"e15c13f0-2f86-4aba-b109-bac20813f7c6\" (UID: \"e15c13f0-2f86-4aba-b109-bac20813f7c6\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.265647 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmx4l\" (UniqueName: \"kubernetes.io/projected/318faba6-6466-4f38-8f29-b01709e93bea-kube-api-access-hmx4l\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.265665 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/318faba6-6466-4f38-8f29-b01709e93bea-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.268154 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" (UID: "cc6bf917-4144-4e4a-9a7a-63aae89d8ad3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.268516 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e15c13f0-2f86-4aba-b109-bac20813f7c6" (UID: "e15c13f0-2f86-4aba-b109-bac20813f7c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.268563 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce3ec590-49ca-4806-9a9e-5699c798e051" (UID: "ce3ec590-49ca-4806-9a9e-5699c798e051"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.269652 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp" (OuterVolumeSpecName: "kube-api-access-rs6cp") pod "e15c13f0-2f86-4aba-b109-bac20813f7c6" (UID: "e15c13f0-2f86-4aba-b109-bac20813f7c6"). InnerVolumeSpecName "kube-api-access-rs6cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.272716 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2" (OuterVolumeSpecName: "kube-api-access-4sml2") pod "ce3ec590-49ca-4806-9a9e-5699c798e051" (UID: "ce3ec590-49ca-4806-9a9e-5699c798e051"). InnerVolumeSpecName "kube-api-access-4sml2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.273228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s" (OuterVolumeSpecName: "kube-api-access-dv56s") pod "cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" (UID: "cc6bf917-4144-4e4a-9a7a-63aae89d8ad3"). InnerVolumeSpecName "kube-api-access-dv56s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367768 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e15c13f0-2f86-4aba-b109-bac20813f7c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367811 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3ec590-49ca-4806-9a9e-5699c798e051-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367824 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367837 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sml2\" (UniqueName: \"kubernetes.io/projected/ce3ec590-49ca-4806-9a9e-5699c798e051-kube-api-access-4sml2\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367853 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv56s\" (UniqueName: \"kubernetes.io/projected/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3-kube-api-access-dv56s\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.367866 4764 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rs6cp\" (UniqueName: \"kubernetes.io/projected/e15c13f0-2f86-4aba-b109-bac20813f7c6-kube-api-access-rs6cp\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.639116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" event={"ID":"ce3ec590-49ca-4806-9a9e-5699c798e051","Type":"ContainerDied","Data":"d3eddf54155392c8d6b6821680ccd07ca8719263d56317d08f89c1610decc21d"} Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.639483 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3eddf54155392c8d6b6821680ccd07ca8719263d56317d08f89c1610decc21d" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.639193 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-38d4-account-create-update-4zn9m" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.650180 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9jbjr" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.650548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9jbjr" event={"ID":"318faba6-6466-4f38-8f29-b01709e93bea","Type":"ContainerDied","Data":"21087845c39cea04665422cad52e248e4e4eab28f1bcab17b51d4a344abdfd3a"} Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.650600 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21087845c39cea04665422cad52e248e4e4eab28f1bcab17b51d4a344abdfd3a" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.677078 4764 generic.go:334] "Generic (PLEG): container finished" podID="d58ac244-171d-4fd1-bfca-12a13315defd" containerID="0d90fa56c1d74b346d6f50a86b5c83b062a0d2592e856e3737dc0f2b1211143d" exitCode=0 Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.677134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerDied","Data":"0d90fa56c1d74b346d6f50a86b5c83b062a0d2592e856e3737dc0f2b1211143d"} Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.681410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ffae-account-create-update-vn59x" event={"ID":"cc6bf917-4144-4e4a-9a7a-63aae89d8ad3","Type":"ContainerDied","Data":"229861780cfc3330f2498db827869a6c5d8b94e9396775178a50f43579deb5c0"} Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.681452 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229861780cfc3330f2498db827869a6c5d8b94e9396775178a50f43579deb5c0" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.681535 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ffae-account-create-update-vn59x" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.684340 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" event={"ID":"e15c13f0-2f86-4aba-b109-bac20813f7c6","Type":"ContainerDied","Data":"24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2"} Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.684365 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24ff090a09c59dd75121deecfc1a1ac278c506ccd883a15ad63d32c6521acbc2" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.684413 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cc83-account-create-update-rcg5p" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.730331 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776232 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7f6z\" (UniqueName: \"kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776351 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd\") pod 
\"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776407 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776528 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.776543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml\") pod \"d58ac244-171d-4fd1-bfca-12a13315defd\" (UID: \"d58ac244-171d-4fd1-bfca-12a13315defd\") " Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.777834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.779832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.793727 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z" (OuterVolumeSpecName: "kube-api-access-w7f6z") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "kube-api-access-w7f6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.796529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts" (OuterVolumeSpecName: "scripts") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.811090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.855515 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.875133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data" (OuterVolumeSpecName: "config-data") pod "d58ac244-171d-4fd1-bfca-12a13315defd" (UID: "d58ac244-171d-4fd1-bfca-12a13315defd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878612 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878648 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878662 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878675 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d58ac244-171d-4fd1-bfca-12a13315defd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 
crc kubenswrapper[4764]: I0320 15:13:09.878688 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7f6z\" (UniqueName: \"kubernetes.io/projected/d58ac244-171d-4fd1-bfca-12a13315defd-kube-api-access-w7f6z\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878699 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:09 crc kubenswrapper[4764]: I0320 15:13:09.878709 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d58ac244-171d-4fd1-bfca-12a13315defd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.696683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d58ac244-171d-4fd1-bfca-12a13315defd","Type":"ContainerDied","Data":"109b71cddf38c7eb36bffddde9010ff840cda217e4b74232ad0a21a3ccbe33bf"} Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.696768 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.697068 4764 scope.go:117] "RemoveContainer" containerID="2b2466b100fb8e8b55d8af9d420ca160e1b2d6c45cb7a1eb2789a7cde9ac60dc" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.726718 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.727887 4764 scope.go:117] "RemoveContainer" containerID="f4c52e5ec68f4e622d08b39821abf77e68e81d8d3cf0b678b3429613f704d1fa" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.745165 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.750147 4764 scope.go:117] "RemoveContainer" containerID="6a7e975d5d96646270ac44cf2b88066ed9a4cd90473ee03382abf176e1f48f80" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.765972 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.766570 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.766676 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.766740 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3ec590-49ca-4806-9a9e-5699c798e051" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.766805 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3ec590-49ca-4806-9a9e-5699c798e051" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.766868 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="proxy-httpd" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.766919 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="proxy-httpd" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.766980 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-api" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.767031 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-api" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.767083 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-central-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.767132 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-central-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.767193 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-notification-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.767243 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-notification-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.767295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e806e8a-b9b4-47eb-bd27-a70a53705c32" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.767569 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e806e8a-b9b4-47eb-bd27-a70a53705c32" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.767663 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15c13f0-2f86-4aba-b109-bac20813f7c6" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.767718 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15c13f0-2f86-4aba-b109-bac20813f7c6" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.767776 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318faba6-6466-4f38-8f29-b01709e93bea" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768086 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="318faba6-6466-4f38-8f29-b01709e93bea" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.768150 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768208 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.768262 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="sg-core" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768327 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="sg-core" Mar 20 15:13:10 crc kubenswrapper[4764]: E0320 15:13:10.768413 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-log" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768467 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-log" Mar 20 15:13:10 crc kubenswrapper[4764]: 
I0320 15:13:10.768742 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-notification-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768872 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768938 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-log" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.768993 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="ceilometer-central-agent" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769055 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e806e8a-b9b4-47eb-bd27-a70a53705c32" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769145 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac36915-9830-4e2c-871c-d8b56e780587" containerName="placement-api" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769211 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="sg-core" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769266 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769331 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3ec590-49ca-4806-9a9e-5699c798e051" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769422 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d58ac244-171d-4fd1-bfca-12a13315defd" containerName="proxy-httpd" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769503 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15c13f0-2f86-4aba-b109-bac20813f7c6" containerName="mariadb-account-create-update" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.769569 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="318faba6-6466-4f38-8f29-b01709e93bea" containerName="mariadb-database-create" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.773404 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.778161 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.778278 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.780971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.805588 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkl5k\" (UniqueName: \"kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.806181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.806636 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.806875 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.807039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.807275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.807456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.806215 4764 scope.go:117] "RemoveContainer" containerID="0d90fa56c1d74b346d6f50a86b5c83b062a0d2592e856e3737dc0f2b1211143d" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909401 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909461 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909570 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkl5k\" (UniqueName: \"kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.909676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.910777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.911672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.917815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.922933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.922998 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.925761 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:10 crc kubenswrapper[4764]: I0320 15:13:10.927069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkl5k\" (UniqueName: \"kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k\") pod \"ceilometer-0\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " pod="openstack/ceilometer-0" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.095292 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.146911 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58ac244-171d-4fd1-bfca-12a13315defd" path="/var/lib/kubelet/pods/d58ac244-171d-4fd1-bfca-12a13315defd/volumes" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.567481 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x4dvd"] Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.568759 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.582223 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.587202 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bnsmb" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.587658 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.605552 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.612842 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x4dvd"] Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.704551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerStarted","Data":"b9dd7756aa9d019d54307bd93682efde9d26904a1c14eef1edb7f5a7fd3c9c3c"} Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.721201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxf7\" (UniqueName: \"kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.721283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: 
\"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.721348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.721501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.823516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxf7\" (UniqueName: \"kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.823597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.823675 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts\") pod 
\"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.823716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.828777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.829116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.829300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-x4dvd\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.840826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxf7\" (UniqueName: \"kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7\") pod \"nova-cell0-conductor-db-sync-x4dvd\" 
(UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:11 crc kubenswrapper[4764]: I0320 15:13:11.947329 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:12 crc kubenswrapper[4764]: I0320 15:13:12.425637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x4dvd"] Mar 20 15:13:12 crc kubenswrapper[4764]: I0320 15:13:12.717288 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" event={"ID":"e1fa9963-3520-46b8-a13f-aa11d6059432","Type":"ContainerStarted","Data":"3161214fd4ce5e60f74530c697a22811aca5568457001f9069f78c760493e3e9"} Mar 20 15:13:12 crc kubenswrapper[4764]: I0320 15:13:12.723201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerStarted","Data":"b3554e5624054046687768546fdd7ad50317cf38ff12af9fce2437fddefc3e37"} Mar 20 15:13:13 crc kubenswrapper[4764]: I0320 15:13:13.740948 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerStarted","Data":"654f6d9c01cd17a74dd9aea076b39dfb1f920d32a74287a217e5163abf341fe8"} Mar 20 15:13:14 crc kubenswrapper[4764]: I0320 15:13:14.735296 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:14 crc kubenswrapper[4764]: I0320 15:13:14.757641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerStarted","Data":"19e4ed20b110ac826e06bae78c73dbdea6bb01c153836aa43f858906c598e3d9"} Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.826936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerStarted","Data":"59302c5bdbcf7c9ff6d84269ee9aee11bea80df53eb3d29c9cd06ad77519dfe8"} Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.827515 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-central-agent" containerID="cri-o://b3554e5624054046687768546fdd7ad50317cf38ff12af9fce2437fddefc3e37" gracePeriod=30 Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.827735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.827977 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="proxy-httpd" containerID="cri-o://59302c5bdbcf7c9ff6d84269ee9aee11bea80df53eb3d29c9cd06ad77519dfe8" gracePeriod=30 Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.828018 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="sg-core" containerID="cri-o://19e4ed20b110ac826e06bae78c73dbdea6bb01c153836aa43f858906c598e3d9" gracePeriod=30 Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.828047 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-notification-agent" containerID="cri-o://654f6d9c01cd17a74dd9aea076b39dfb1f920d32a74287a217e5163abf341fe8" gracePeriod=30 Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.830879 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" 
event={"ID":"e1fa9963-3520-46b8-a13f-aa11d6059432","Type":"ContainerStarted","Data":"76c0c17724d68d2790828c6ae96c078ac5b894f816e6140aae4c872f7b44e425"} Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.857723 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.324755446 podStartE2EDuration="11.857693692s" podCreationTimestamp="2026-03-20 15:13:10 +0000 UTC" firstStartedPulling="2026-03-20 15:13:11.572316866 +0000 UTC m=+1313.188505995" lastFinishedPulling="2026-03-20 15:13:21.105255112 +0000 UTC m=+1322.721444241" observedRunningTime="2026-03-20 15:13:21.854489464 +0000 UTC m=+1323.470678623" watchObservedRunningTime="2026-03-20 15:13:21.857693692 +0000 UTC m=+1323.473882861" Mar 20 15:13:21 crc kubenswrapper[4764]: I0320 15:13:21.883578 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" podStartSLOduration=2.207668878 podStartE2EDuration="10.883554851s" podCreationTimestamp="2026-03-20 15:13:11 +0000 UTC" firstStartedPulling="2026-03-20 15:13:12.430433652 +0000 UTC m=+1314.046622781" lastFinishedPulling="2026-03-20 15:13:21.106319625 +0000 UTC m=+1322.722508754" observedRunningTime="2026-03-20 15:13:21.872933107 +0000 UTC m=+1323.489122236" watchObservedRunningTime="2026-03-20 15:13:21.883554851 +0000 UTC m=+1323.499743990" Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.844555 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerID="59302c5bdbcf7c9ff6d84269ee9aee11bea80df53eb3d29c9cd06ad77519dfe8" exitCode=0 Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.844768 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerID="19e4ed20b110ac826e06bae78c73dbdea6bb01c153836aa43f858906c598e3d9" exitCode=2 Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.844777 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerID="654f6d9c01cd17a74dd9aea076b39dfb1f920d32a74287a217e5163abf341fe8" exitCode=0 Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.844784 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerID="b3554e5624054046687768546fdd7ad50317cf38ff12af9fce2437fddefc3e37" exitCode=0 Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.845501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerDied","Data":"59302c5bdbcf7c9ff6d84269ee9aee11bea80df53eb3d29c9cd06ad77519dfe8"} Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.845523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerDied","Data":"19e4ed20b110ac826e06bae78c73dbdea6bb01c153836aa43f858906c598e3d9"} Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.845533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerDied","Data":"654f6d9c01cd17a74dd9aea076b39dfb1f920d32a74287a217e5163abf341fe8"} Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.845542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerDied","Data":"b3554e5624054046687768546fdd7ad50317cf38ff12af9fce2437fddefc3e37"} Mar 20 15:13:22 crc kubenswrapper[4764]: I0320 15:13:22.968296 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070359 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070450 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkl5k\" (UniqueName: 
\"kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.070678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle\") pod \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\" (UID: \"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7\") " Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.072087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.072636 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.078910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts" (OuterVolumeSpecName: "scripts") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.091451 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k" (OuterVolumeSpecName: "kube-api-access-hkl5k") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "kube-api-access-hkl5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.128821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.172605 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkl5k\" (UniqueName: \"kubernetes.io/projected/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-kube-api-access-hkl5k\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.172632 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.172651 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.172660 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-log-httpd\") on 
node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.172669 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.189805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data" (OuterVolumeSpecName: "config-data") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.194643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" (UID: "8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.275162 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.275203 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.862902 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7","Type":"ContainerDied","Data":"b9dd7756aa9d019d54307bd93682efde9d26904a1c14eef1edb7f5a7fd3c9c3c"} Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.863116 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.863159 4764 scope.go:117] "RemoveContainer" containerID="59302c5bdbcf7c9ff6d84269ee9aee11bea80df53eb3d29c9cd06ad77519dfe8" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.895709 4764 scope.go:117] "RemoveContainer" containerID="19e4ed20b110ac826e06bae78c73dbdea6bb01c153836aa43f858906c598e3d9" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.916271 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.923415 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.950338 4764 scope.go:117] "RemoveContainer" containerID="654f6d9c01cd17a74dd9aea076b39dfb1f920d32a74287a217e5163abf341fe8" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.958590 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:23 crc kubenswrapper[4764]: E0320 15:13:23.959000 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="sg-core" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959017 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="sg-core" Mar 20 15:13:23 crc kubenswrapper[4764]: E0320 15:13:23.959037 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-central-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959045 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-central-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: E0320 15:13:23.959062 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="proxy-httpd" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959068 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="proxy-httpd" Mar 20 15:13:23 crc kubenswrapper[4764]: E0320 15:13:23.959083 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-notification-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959089 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-notification-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959239 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-notification-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959254 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="ceilometer-central-agent" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959265 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="proxy-httpd" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.959284 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" containerName="sg-core" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.960666 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.963197 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.963359 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.973209 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:23 crc kubenswrapper[4764]: I0320 15:13:23.982007 4764 scope.go:117] "RemoveContainer" containerID="b3554e5624054046687768546fdd7ad50317cf38ff12af9fce2437fddefc3e37" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qblzc\" (UniqueName: \"kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090736 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.090816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.101232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.203345 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.203442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.203473 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.203506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qblzc\" (UniqueName: \"kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.204410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.204507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.204548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.204879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.205391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.208699 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.210507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.210678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.212774 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.224037 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qblzc\" (UniqueName: \"kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc\") pod \"ceilometer-0\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.286313 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.804702 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:13:24 crc kubenswrapper[4764]: I0320 15:13:24.871509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerStarted","Data":"f4497b611d0052ddefba78208add4b9d3dd101b374af3825351b3d2e7b8679f4"} Mar 20 15:13:25 crc kubenswrapper[4764]: I0320 15:13:25.138183 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7" path="/var/lib/kubelet/pods/8ccd45ad-7a01-4c1f-90d6-1fc586d6f6e7/volumes" Mar 20 15:13:25 crc kubenswrapper[4764]: I0320 15:13:25.895301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerStarted","Data":"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d"} Mar 20 15:13:26 crc kubenswrapper[4764]: I0320 15:13:26.909118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerStarted","Data":"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477"} Mar 20 15:13:27 crc kubenswrapper[4764]: I0320 15:13:27.920233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerStarted","Data":"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717"} Mar 20 15:13:29 crc kubenswrapper[4764]: I0320 15:13:29.947463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerStarted","Data":"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389"} Mar 20 15:13:29 crc kubenswrapper[4764]: I0320 
15:13:29.976422 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.502224098 podStartE2EDuration="6.97640474s" podCreationTimestamp="2026-03-20 15:13:23 +0000 UTC" firstStartedPulling="2026-03-20 15:13:24.798713072 +0000 UTC m=+1326.414902201" lastFinishedPulling="2026-03-20 15:13:29.272893714 +0000 UTC m=+1330.889082843" observedRunningTime="2026-03-20 15:13:29.973636175 +0000 UTC m=+1331.589825304" watchObservedRunningTime="2026-03-20 15:13:29.97640474 +0000 UTC m=+1331.592593869" Mar 20 15:13:30 crc kubenswrapper[4764]: I0320 15:13:30.955451 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:13:31 crc kubenswrapper[4764]: I0320 15:13:31.965324 4764 generic.go:334] "Generic (PLEG): container finished" podID="e1fa9963-3520-46b8-a13f-aa11d6059432" containerID="76c0c17724d68d2790828c6ae96c078ac5b894f816e6140aae4c872f7b44e425" exitCode=0 Mar 20 15:13:31 crc kubenswrapper[4764]: I0320 15:13:31.965421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" event={"ID":"e1fa9963-3520-46b8-a13f-aa11d6059432","Type":"ContainerDied","Data":"76c0c17724d68d2790828c6ae96c078ac5b894f816e6140aae4c872f7b44e425"} Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.295703 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.378539 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xxf7\" (UniqueName: \"kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7\") pod \"e1fa9963-3520-46b8-a13f-aa11d6059432\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.378679 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data\") pod \"e1fa9963-3520-46b8-a13f-aa11d6059432\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.378716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle\") pod \"e1fa9963-3520-46b8-a13f-aa11d6059432\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.378749 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts\") pod \"e1fa9963-3520-46b8-a13f-aa11d6059432\" (UID: \"e1fa9963-3520-46b8-a13f-aa11d6059432\") " Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.386611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts" (OuterVolumeSpecName: "scripts") pod "e1fa9963-3520-46b8-a13f-aa11d6059432" (UID: "e1fa9963-3520-46b8-a13f-aa11d6059432"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.386676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7" (OuterVolumeSpecName: "kube-api-access-5xxf7") pod "e1fa9963-3520-46b8-a13f-aa11d6059432" (UID: "e1fa9963-3520-46b8-a13f-aa11d6059432"). InnerVolumeSpecName "kube-api-access-5xxf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.406641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1fa9963-3520-46b8-a13f-aa11d6059432" (UID: "e1fa9963-3520-46b8-a13f-aa11d6059432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.425826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data" (OuterVolumeSpecName: "config-data") pod "e1fa9963-3520-46b8-a13f-aa11d6059432" (UID: "e1fa9963-3520-46b8-a13f-aa11d6059432"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.480857 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xxf7\" (UniqueName: \"kubernetes.io/projected/e1fa9963-3520-46b8-a13f-aa11d6059432-kube-api-access-5xxf7\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.480890 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.480904 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.480916 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1fa9963-3520-46b8-a13f-aa11d6059432-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.992130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" event={"ID":"e1fa9963-3520-46b8-a13f-aa11d6059432","Type":"ContainerDied","Data":"3161214fd4ce5e60f74530c697a22811aca5568457001f9069f78c760493e3e9"} Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.992471 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3161214fd4ce5e60f74530c697a22811aca5568457001f9069f78c760493e3e9" Mar 20 15:13:33 crc kubenswrapper[4764]: I0320 15:13:33.992202 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-x4dvd" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.097514 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:13:34 crc kubenswrapper[4764]: E0320 15:13:34.097999 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fa9963-3520-46b8-a13f-aa11d6059432" containerName="nova-cell0-conductor-db-sync" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.098017 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fa9963-3520-46b8-a13f-aa11d6059432" containerName="nova-cell0-conductor-db-sync" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.098295 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fa9963-3520-46b8-a13f-aa11d6059432" containerName="nova-cell0-conductor-db-sync" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.099001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.106122 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bnsmb" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.107203 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.115343 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.194052 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/700deb88-8c5b-470d-a686-664ec01cc1e4-kube-api-access-rj8n4\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc 
kubenswrapper[4764]: I0320 15:13:34.194273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.194325 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.295811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.295881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.295920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/700deb88-8c5b-470d-a686-664ec01cc1e4-kube-api-access-rj8n4\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.300777 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.301912 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700deb88-8c5b-470d-a686-664ec01cc1e4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.313016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/700deb88-8c5b-470d-a686-664ec01cc1e4-kube-api-access-rj8n4\") pod \"nova-cell0-conductor-0\" (UID: \"700deb88-8c5b-470d-a686-664ec01cc1e4\") " pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.433430 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:34 crc kubenswrapper[4764]: I0320 15:13:34.861082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 15:13:35 crc kubenswrapper[4764]: I0320 15:13:35.001113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"700deb88-8c5b-470d-a686-664ec01cc1e4","Type":"ContainerStarted","Data":"02b3c11b714b6f6699e526169174ae3118db7b58ab5c5a280f1a24673321d27d"} Mar 20 15:13:36 crc kubenswrapper[4764]: I0320 15:13:36.015136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"700deb88-8c5b-470d-a686-664ec01cc1e4","Type":"ContainerStarted","Data":"9853de765354ce8adfceadfeca7d0cec4b03edf28776fdb0660a5445e16ceb5f"} Mar 20 15:13:36 crc kubenswrapper[4764]: I0320 15:13:36.015702 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:36 crc kubenswrapper[4764]: I0320 15:13:36.045604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.045587773 podStartE2EDuration="2.045587773s" podCreationTimestamp="2026-03-20 15:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:36.038099374 +0000 UTC m=+1337.654288513" watchObservedRunningTime="2026-03-20 15:13:36.045587773 +0000 UTC m=+1337.661776902" Mar 20 15:13:38 crc kubenswrapper[4764]: I0320 15:13:38.444140 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:13:38 crc kubenswrapper[4764]: I0320 15:13:38.444479 4764 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:13:44 crc kubenswrapper[4764]: I0320 15:13:44.474352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.023165 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cr8wd"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.024493 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.026829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.037615 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.045599 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cr8wd"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.113889 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.114011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngdr6\" (UniqueName: 
\"kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.114099 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.114126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.167771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.169274 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.171343 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.203082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.217333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.217441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngdr6\" (UniqueName: \"kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.217496 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.217512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.250106 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.253998 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.275107 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.276355 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.279512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.280207 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.294531 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.295976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngdr6\" (UniqueName: \"kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6\") pod 
\"nova-cell0-cell-mapping-cr8wd\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") " pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.321280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.321330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.321419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x9sp\" (UniqueName: \"kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.321444 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.321516 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.322581 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.329969 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.335838 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.369587 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cr8wd" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.402397 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.406287 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.412760 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.417131 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9jnc\" (UniqueName: \"kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " 
pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423809 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4x9sp\" (UniqueName: \"kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czm2g\" (UniqueName: \"kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.423852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.427947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.430960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.445930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.476820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x9sp\" (UniqueName: \"kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp\") pod \"nova-api-0\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.503025 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.504347 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"] Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.505803 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526105 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czm2g\" (UniqueName: \"kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9jnc\" (UniqueName: \"kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526258 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m8d2\" (UniqueName: \"kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" 
Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.526322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.531038 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.532503 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.533012 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.534334 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"] Mar 20 15:13:45 crc 
kubenswrapper[4764]: I0320 15:13:45.554035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czm2g\" (UniqueName: \"kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.554797 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9jnc\" (UniqueName: \"kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc\") pod \"nova-scheduler-0\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.556512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.627846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.627907 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628126 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjnr\" (UniqueName: \"kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m8d2\" (UniqueName: \"kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.628961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.637269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.638413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data\") pod \"nova-metadata-0\" (UID: 
\"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.649013 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m8d2\" (UniqueName: \"kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2\") pod \"nova-metadata-0\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.697699 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730267 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjnr\" (UniqueName: \"kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730414 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.730439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.731281 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.731981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.732601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.734504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.735162 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.748828 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fjnr\" (UniqueName: \"kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr\") pod \"dnsmasq-dns-757b4f8459-7fnxn\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") " pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.805022 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.826214 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.844951 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:45 crc kubenswrapper[4764]: I0320 15:13:45.966113 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cr8wd"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.091912 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:13:46 crc kubenswrapper[4764]: W0320 15:13:46.101446 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa12ad80_b4a5_4af1_bbe6_1e28e97acb2d.slice/crio-a498f1d783b096f5242c4e24a3d880a5018b96fa69cb776d96ef70bee7277158 WatchSource:0}: Error finding container a498f1d783b096f5242c4e24a3d880a5018b96fa69cb776d96ef70bee7277158: Status 404 returned error can't find the container with id a498f1d783b096f5242c4e24a3d880a5018b96fa69cb776d96ef70bee7277158 Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.132987 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.135923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cr8wd" event={"ID":"15a21459-b477-4b13-847f-4997f3c4529f","Type":"ContainerStarted","Data":"465e69aac46156a4a78116d5e2a9aab0017222fa4d65760be0f3a961c470bb97"} Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.141228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerStarted","Data":"a498f1d783b096f5242c4e24a3d880a5018b96fa69cb776d96ef70bee7277158"} Mar 20 15:13:46 crc kubenswrapper[4764]: W0320 15:13:46.148207 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570012fa_4a72_4a1e_905d_3e46c21da637.slice/crio-8cdba5767c08d3e4f7bc938ce460e87366186f92942495e08bd23e9763463970 
WatchSource:0}: Error finding container 8cdba5767c08d3e4f7bc938ce460e87366186f92942495e08bd23e9763463970: Status 404 returned error can't find the container with id 8cdba5767c08d3e4f7bc938ce460e87366186f92942495e08bd23e9763463970 Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.239069 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k8k8m"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.241466 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.243444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.243613 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.267830 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k8k8m"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.295724 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.341627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.341691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: 
\"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.342164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.342426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtxgm\" (UniqueName: \"kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: W0320 15:13:46.433635 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f6741cd_7933_46a6_b210_b801b26038e9.slice/crio-ddac0000cc1a92294b067f6f9d0e472adc59fcda26be9623b45e2c65fb150f13 WatchSource:0}: Error finding container ddac0000cc1a92294b067f6f9d0e472adc59fcda26be9623b45e2c65fb150f13: Status 404 returned error can't find the container with id ddac0000cc1a92294b067f6f9d0e472adc59fcda26be9623b45e2c65fb150f13 Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.451406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.466768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") 
" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.466881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtxgm\" (UniqueName: \"kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.467036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.467080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.467796 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"] Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.472689 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.473557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.485078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.485524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtxgm\" (UniqueName: \"kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm\") pod \"nova-cell1-conductor-db-sync-k8k8m\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") " pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:46 crc kubenswrapper[4764]: I0320 15:13:46.640228 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.119340 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k8k8m"] Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.168475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a17efe33-2090-4201-be2e-0add4be515a7","Type":"ContainerStarted","Data":"16ebc0abd673a6d450c4f8b6c149d9d93b769abcf48df3feefcf1545c23a67dc"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.171177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cr8wd" event={"ID":"15a21459-b477-4b13-847f-4997f3c4529f","Type":"ContainerStarted","Data":"64406e85d4e06b92e3255fc64afe33bd8499a8cd5401e852428f006ced123ff7"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.175309 4764 generic.go:334] "Generic (PLEG): container finished" podID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerID="26ed4d66726c21a059cb2bce0b8e44bd4110fcd78ec96d1d90e14cb266212972" exitCode=0 Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.175402 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" event={"ID":"e1cb5067-4f80-440e-9c1e-c422e012190d","Type":"ContainerDied","Data":"26ed4d66726c21a059cb2bce0b8e44bd4110fcd78ec96d1d90e14cb266212972"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.175428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" event={"ID":"e1cb5067-4f80-440e-9c1e-c422e012190d","Type":"ContainerStarted","Data":"e415b701cad069ab8a5ecb24fc0dbd217ce333c67a7a91c8ba16b449bcef54e1"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.177183 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"570012fa-4a72-4a1e-905d-3e46c21da637","Type":"ContainerStarted","Data":"8cdba5767c08d3e4f7bc938ce460e87366186f92942495e08bd23e9763463970"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.180745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerStarted","Data":"ddac0000cc1a92294b067f6f9d0e472adc59fcda26be9623b45e2c65fb150f13"} Mar 20 15:13:47 crc kubenswrapper[4764]: I0320 15:13:47.194116 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cr8wd" podStartSLOduration=2.194095941 podStartE2EDuration="2.194095941s" podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:47.186649943 +0000 UTC m=+1348.802839072" watchObservedRunningTime="2026-03-20 15:13:47.194095941 +0000 UTC m=+1348.810285080" Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.195727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" event={"ID":"9a4d6564-31b4-4743-8acd-d1a431370201","Type":"ContainerStarted","Data":"e656f614766f094b6c759eccf2b240cdb5224a737c99af67801a7606c56a3898"} Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.196681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" event={"ID":"9a4d6564-31b4-4743-8acd-d1a431370201","Type":"ContainerStarted","Data":"3ca06a5c96234e798344ea65027bfc81128e8b93b45fbc477e6e1e02cfdad07f"} Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.200579 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" event={"ID":"e1cb5067-4f80-440e-9c1e-c422e012190d","Type":"ContainerStarted","Data":"5ca67c7488940d22b3e29c7c1ec2d664c3528065a3fbb57b032a20df5efa9e58"} Mar 20 15:13:48 
crc kubenswrapper[4764]: I0320 15:13:48.200645 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.219241 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" podStartSLOduration=2.219223748 podStartE2EDuration="2.219223748s" podCreationTimestamp="2026-03-20 15:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:48.208695386 +0000 UTC m=+1349.824884535" watchObservedRunningTime="2026-03-20 15:13:48.219223748 +0000 UTC m=+1349.835412877" Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.236989 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" podStartSLOduration=3.23697031 podStartE2EDuration="3.23697031s" podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:48.230778661 +0000 UTC m=+1349.846967780" watchObservedRunningTime="2026-03-20 15:13:48.23697031 +0000 UTC m=+1349.853159439" Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.567439 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:48 crc kubenswrapper[4764]: I0320 15:13:48.627620 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.244124 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerStarted","Data":"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.244721 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerStarted","Data":"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.244849 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-log" containerID="cri-o://4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981" gracePeriod=30 Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.245449 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-metadata" containerID="cri-o://cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7" gracePeriod=30 Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.255736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a17efe33-2090-4201-be2e-0add4be515a7","Type":"ContainerStarted","Data":"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.255871 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a17efe33-2090-4201-be2e-0add4be515a7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2" gracePeriod=30 Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.268998 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.308012479 podStartE2EDuration="5.268981088s" podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="2026-03-20 15:13:46.455021209 +0000 UTC m=+1348.071210328" 
lastFinishedPulling="2026-03-20 15:13:49.415989818 +0000 UTC m=+1351.032178937" observedRunningTime="2026-03-20 15:13:50.264816221 +0000 UTC m=+1351.881005350" watchObservedRunningTime="2026-03-20 15:13:50.268981088 +0000 UTC m=+1351.885170217" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.291202 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerStarted","Data":"682152bd8dec83672f26bef33893f919ebd46ba3f0609609bf4964b314a37c7b"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.291525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerStarted","Data":"01c8782d7264a74c29d540468f39e72803570801ecf6a58f7fdbfc7f99a1f72f"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.296200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"570012fa-4a72-4a1e-905d-3e46c21da637","Type":"ContainerStarted","Data":"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea"} Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.298013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.173914764 podStartE2EDuration="5.297998074s" podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="2026-03-20 15:13:46.287545874 +0000 UTC m=+1347.903735003" lastFinishedPulling="2026-03-20 15:13:49.411629184 +0000 UTC m=+1351.027818313" observedRunningTime="2026-03-20 15:13:50.292888998 +0000 UTC m=+1351.909078137" watchObservedRunningTime="2026-03-20 15:13:50.297998074 +0000 UTC m=+1351.914187203" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.313361 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.004892243 podStartE2EDuration="5.313348593s" 
podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="2026-03-20 15:13:46.103161493 +0000 UTC m=+1347.719350622" lastFinishedPulling="2026-03-20 15:13:49.411617833 +0000 UTC m=+1351.027806972" observedRunningTime="2026-03-20 15:13:50.311834307 +0000 UTC m=+1351.928023436" watchObservedRunningTime="2026-03-20 15:13:50.313348593 +0000 UTC m=+1351.929537722" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.334621 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.073352193 podStartE2EDuration="5.334605682s" podCreationTimestamp="2026-03-20 15:13:45 +0000 UTC" firstStartedPulling="2026-03-20 15:13:46.153108268 +0000 UTC m=+1347.769297397" lastFinishedPulling="2026-03-20 15:13:49.414361767 +0000 UTC m=+1351.030550886" observedRunningTime="2026-03-20 15:13:50.329186857 +0000 UTC m=+1351.945375986" watchObservedRunningTime="2026-03-20 15:13:50.334605682 +0000 UTC m=+1351.950794801" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.697859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.805938 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.891462 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.981553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle\") pod \"1f6741cd-7933-46a6-b210-b801b26038e9\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.981625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m8d2\" (UniqueName: \"kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2\") pod \"1f6741cd-7933-46a6-b210-b801b26038e9\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.981660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs\") pod \"1f6741cd-7933-46a6-b210-b801b26038e9\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.981849 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data\") pod \"1f6741cd-7933-46a6-b210-b801b26038e9\" (UID: \"1f6741cd-7933-46a6-b210-b801b26038e9\") " Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.982159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs" (OuterVolumeSpecName: "logs") pod "1f6741cd-7933-46a6-b210-b801b26038e9" (UID: "1f6741cd-7933-46a6-b210-b801b26038e9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.982549 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f6741cd-7933-46a6-b210-b801b26038e9-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:50 crc kubenswrapper[4764]: I0320 15:13:50.987284 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2" (OuterVolumeSpecName: "kube-api-access-6m8d2") pod "1f6741cd-7933-46a6-b210-b801b26038e9" (UID: "1f6741cd-7933-46a6-b210-b801b26038e9"). InnerVolumeSpecName "kube-api-access-6m8d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.008597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f6741cd-7933-46a6-b210-b801b26038e9" (UID: "1f6741cd-7933-46a6-b210-b801b26038e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.033313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data" (OuterVolumeSpecName: "config-data") pod "1f6741cd-7933-46a6-b210-b801b26038e9" (UID: "1f6741cd-7933-46a6-b210-b801b26038e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.084998 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.085173 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m8d2\" (UniqueName: \"kubernetes.io/projected/1f6741cd-7933-46a6-b210-b801b26038e9-kube-api-access-6m8d2\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.085197 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f6741cd-7933-46a6-b210-b801b26038e9-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.307149 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f6741cd-7933-46a6-b210-b801b26038e9" containerID="cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7" exitCode=0
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.307176 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f6741cd-7933-46a6-b210-b801b26038e9" containerID="4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981" exitCode=143
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.307755 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.308605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerDied","Data":"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"}
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.308642 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerDied","Data":"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"}
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.308653 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f6741cd-7933-46a6-b210-b801b26038e9","Type":"ContainerDied","Data":"ddac0000cc1a92294b067f6f9d0e472adc59fcda26be9623b45e2c65fb150f13"}
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.308667 4764 scope.go:117] "RemoveContainer" containerID="cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.334333 4764 scope.go:117] "RemoveContainer" containerID="4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.340649 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.370342 4764 scope.go:117] "RemoveContainer" containerID="cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"
Mar 20 15:13:51 crc kubenswrapper[4764]: E0320 15:13:51.372682 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7\": container with ID starting with cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7 not found: ID does not exist" containerID="cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.372709 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"} err="failed to get container status \"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7\": rpc error: code = NotFound desc = could not find container \"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7\": container with ID starting with cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7 not found: ID does not exist"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.372728 4764 scope.go:117] "RemoveContainer" containerID="4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"
Mar 20 15:13:51 crc kubenswrapper[4764]: E0320 15:13:51.377096 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981\": container with ID starting with 4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981 not found: ID does not exist" containerID="4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.377179 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"} err="failed to get container status \"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981\": rpc error: code = NotFound desc = could not find container \"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981\": container with ID starting with 4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981 not found: ID does not exist"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.377213 4764 scope.go:117] "RemoveContainer" containerID="cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.380773 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7"} err="failed to get container status \"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7\": rpc error: code = NotFound desc = could not find container \"cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7\": container with ID starting with cb21e0986d344f552a3488dd8c9f62bff2996ef33d80e98ae2d4d7f6a25592f7 not found: ID does not exist"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.380820 4764 scope.go:117] "RemoveContainer" containerID="4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.385511 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.386830 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981"} err="failed to get container status \"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981\": rpc error: code = NotFound desc = could not find container \"4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981\": container with ID starting with 4b8fdea9e956162e9243aa2a8abc2f2c7f38e706fc922ecf7dfe243ec01a3981 not found: ID does not exist"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.405323 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:51 crc kubenswrapper[4764]: E0320 15:13:51.405783 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-metadata"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.405801 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-metadata"
Mar 20 15:13:51 crc kubenswrapper[4764]: E0320 15:13:51.405840 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-log"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.405846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-log"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.406029 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-log"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.406043 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" containerName="nova-metadata-metadata"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.407040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.411000 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.415563 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 15:13:51 crc kubenswrapper[4764]: E0320 15:13:51.426373 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f6741cd_7933_46a6_b210_b801b26038e9.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.461913 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.595860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.596458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.596736 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf27n\" (UniqueName: \"kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.596995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.597462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.699212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.699330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.699369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.699410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf27n\" (UniqueName: \"kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.699437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.700208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.704420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.705341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.710920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.729834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf27n\" (UniqueName: \"kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n\") pod \"nova-metadata-0\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " pod="openstack/nova-metadata-0"
Mar 20 15:13:51 crc kubenswrapper[4764]: I0320 15:13:51.735498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 15:13:52 crc kubenswrapper[4764]: I0320 15:13:52.208644 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:52 crc kubenswrapper[4764]: W0320 15:13:52.217579 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d339190_1b2c_43d7_91fb_86ec63bee9c7.slice/crio-5bbb4e1ade9041b7bd962b7670f5004dac66f55d134293a646808f24a0000e84 WatchSource:0}: Error finding container 5bbb4e1ade9041b7bd962b7670f5004dac66f55d134293a646808f24a0000e84: Status 404 returned error can't find the container with id 5bbb4e1ade9041b7bd962b7670f5004dac66f55d134293a646808f24a0000e84
Mar 20 15:13:52 crc kubenswrapper[4764]: I0320 15:13:52.315950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerStarted","Data":"5bbb4e1ade9041b7bd962b7670f5004dac66f55d134293a646808f24a0000e84"}
Mar 20 15:13:53 crc kubenswrapper[4764]: I0320 15:13:53.136038 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f6741cd-7933-46a6-b210-b801b26038e9" path="/var/lib/kubelet/pods/1f6741cd-7933-46a6-b210-b801b26038e9/volumes"
Mar 20 15:13:53 crc kubenswrapper[4764]: I0320 15:13:53.338071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerStarted","Data":"2fbd0438563c952d720b877ea2c47f9f2178f878a5051b291b478824ada6dcb5"}
Mar 20 15:13:53 crc kubenswrapper[4764]: I0320 15:13:53.338112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerStarted","Data":"8a291a3d45e37389b19380d0ca98611e483f74521ca1a413679c44e810b1cc77"}
Mar 20 15:13:53 crc kubenswrapper[4764]: I0320 15:13:53.362295 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.362275327 podStartE2EDuration="2.362275327s" podCreationTimestamp="2026-03-20 15:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:53.360255596 +0000 UTC m=+1354.976444735" watchObservedRunningTime="2026-03-20 15:13:53.362275327 +0000 UTC m=+1354.978464456"
Mar 20 15:13:54 crc kubenswrapper[4764]: I0320 15:13:54.297313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 15:13:54 crc kubenswrapper[4764]: I0320 15:13:54.348405 4764 generic.go:334] "Generic (PLEG): container finished" podID="15a21459-b477-4b13-847f-4997f3c4529f" containerID="64406e85d4e06b92e3255fc64afe33bd8499a8cd5401e852428f006ced123ff7" exitCode=0
Mar 20 15:13:54 crc kubenswrapper[4764]: I0320 15:13:54.348530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cr8wd" event={"ID":"15a21459-b477-4b13-847f-4997f3c4529f","Type":"ContainerDied","Data":"64406e85d4e06b92e3255fc64afe33bd8499a8cd5401e852428f006ced123ff7"}
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.365201 4764 generic.go:334] "Generic (PLEG): container finished" podID="9a4d6564-31b4-4743-8acd-d1a431370201" containerID="e656f614766f094b6c759eccf2b240cdb5224a737c99af67801a7606c56a3898" exitCode=0
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.365316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" event={"ID":"9a4d6564-31b4-4743-8acd-d1a431370201","Type":"ContainerDied","Data":"e656f614766f094b6c759eccf2b240cdb5224a737c99af67801a7606c56a3898"}
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.507295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.507336 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.801101 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cr8wd"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.806514 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.841481 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.846837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn"
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.916090 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"]
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.916574 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="dnsmasq-dns" containerID="cri-o://59d10a14895dddc75eda73f47e084a6ca150ba609f6efc06eb77693aefa513a8" gracePeriod=10
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.986137 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts\") pod \"15a21459-b477-4b13-847f-4997f3c4529f\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") "
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.986295 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data\") pod \"15a21459-b477-4b13-847f-4997f3c4529f\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") "
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.986334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngdr6\" (UniqueName: \"kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6\") pod \"15a21459-b477-4b13-847f-4997f3c4529f\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") "
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.986358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle\") pod \"15a21459-b477-4b13-847f-4997f3c4529f\" (UID: \"15a21459-b477-4b13-847f-4997f3c4529f\") "
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.993753 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6" (OuterVolumeSpecName: "kube-api-access-ngdr6") pod "15a21459-b477-4b13-847f-4997f3c4529f" (UID: "15a21459-b477-4b13-847f-4997f3c4529f"). InnerVolumeSpecName "kube-api-access-ngdr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:13:55 crc kubenswrapper[4764]: I0320 15:13:55.995309 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts" (OuterVolumeSpecName: "scripts") pod "15a21459-b477-4b13-847f-4997f3c4529f" (UID: "15a21459-b477-4b13-847f-4997f3c4529f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.023640 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a21459-b477-4b13-847f-4997f3c4529f" (UID: "15a21459-b477-4b13-847f-4997f3c4529f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.024578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data" (OuterVolumeSpecName: "config-data") pod "15a21459-b477-4b13-847f-4997f3c4529f" (UID: "15a21459-b477-4b13-847f-4997f3c4529f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.088832 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.088859 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.088869 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngdr6\" (UniqueName: \"kubernetes.io/projected/15a21459-b477-4b13-847f-4997f3c4529f-kube-api-access-ngdr6\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.088878 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a21459-b477-4b13-847f-4997f3c4529f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.379151 4764 generic.go:334] "Generic (PLEG): container finished" podID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerID="59d10a14895dddc75eda73f47e084a6ca150ba609f6efc06eb77693aefa513a8" exitCode=0
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.379446 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" event={"ID":"83217f94-75f8-4f9b-b9d1-4247c602cc26","Type":"ContainerDied","Data":"59d10a14895dddc75eda73f47e084a6ca150ba609f6efc06eb77693aefa513a8"}
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.379471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" event={"ID":"83217f94-75f8-4f9b-b9d1-4247c602cc26","Type":"ContainerDied","Data":"c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee"}
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.379481 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9ece5a78905a13d97080b4c250ec4ba0694f65d72bff66f469f496e8b5954ee"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.381725 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cr8wd"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.382489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cr8wd" event={"ID":"15a21459-b477-4b13-847f-4997f3c4529f","Type":"ContainerDied","Data":"465e69aac46156a4a78116d5e2a9aab0017222fa4d65760be0f3a961c470bb97"}
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.382545 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="465e69aac46156a4a78116d5e2a9aab0017222fa4d65760be0f3a961c470bb97"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.426585 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.433579 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.563848 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.564044 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-log" containerID="cri-o://01c8782d7264a74c29d540468f39e72803570801ecf6a58f7fdbfc7f99a1f72f" gracePeriod=30
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.564628 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-api" containerID="cri-o://682152bd8dec83672f26bef33893f919ebd46ba3f0609609bf4964b314a37c7b" gracePeriod=30
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.568299 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.568469 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.590342 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.590565 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-log" containerID="cri-o://8a291a3d45e37389b19380d0ca98611e483f74521ca1a413679c44e810b1cc77" gracePeriod=30
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.590963 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-metadata" containerID="cri-o://2fbd0438563c952d720b877ea2c47f9f2178f878a5051b291b478824ada6dcb5" gracePeriod=30
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgchs\" (UniqueName: \"kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599457 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599648 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.599690 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config\") pod \"83217f94-75f8-4f9b-b9d1-4247c602cc26\" (UID: \"83217f94-75f8-4f9b-b9d1-4247c602cc26\") "
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.607524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs" (OuterVolumeSpecName: "kube-api-access-fgchs") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "kube-api-access-fgchs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.676949 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.679432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config" (OuterVolumeSpecName: "config") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.683227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.687298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.699453 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83217f94-75f8-4f9b-b9d1-4247c602cc26" (UID: "83217f94-75f8-4f9b-b9d1-4247c602cc26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.706083 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.706273 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgchs\" (UniqueName: \"kubernetes.io/projected/83217f94-75f8-4f9b-b9d1-4247c602cc26-kube-api-access-fgchs\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.717648 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.717686 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.717702 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.717714 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83217f94-75f8-4f9b-b9d1-4247c602cc26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.881066 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k8k8m"
Mar 20 15:13:56 crc kubenswrapper[4764]: I0320 15:13:56.984460 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.026914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtxgm\" (UniqueName: \"kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm\") pod \"9a4d6564-31b4-4743-8acd-d1a431370201\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") "
Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.027041 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle\") pod \"9a4d6564-31b4-4743-8acd-d1a431370201\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") "
Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.027237 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts\") pod \"9a4d6564-31b4-4743-8acd-d1a431370201\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") "
Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.027256 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data\") pod \"9a4d6564-31b4-4743-8acd-d1a431370201\" (UID: \"9a4d6564-31b4-4743-8acd-d1a431370201\") "
Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.038578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm" (OuterVolumeSpecName: "kube-api-access-qtxgm") pod "9a4d6564-31b4-4743-8acd-d1a431370201" (UID: "9a4d6564-31b4-4743-8acd-d1a431370201").
InnerVolumeSpecName "kube-api-access-qtxgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.041493 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts" (OuterVolumeSpecName: "scripts") pod "9a4d6564-31b4-4743-8acd-d1a431370201" (UID: "9a4d6564-31b4-4743-8acd-d1a431370201"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.057678 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a4d6564-31b4-4743-8acd-d1a431370201" (UID: "9a4d6564-31b4-4743-8acd-d1a431370201"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.057933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data" (OuterVolumeSpecName: "config-data") pod "9a4d6564-31b4-4743-8acd-d1a431370201" (UID: "9a4d6564-31b4-4743-8acd-d1a431370201"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.129314 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.129458 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.129548 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtxgm\" (UniqueName: \"kubernetes.io/projected/9a4d6564-31b4-4743-8acd-d1a431370201-kube-api-access-qtxgm\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.129646 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a4d6564-31b4-4743-8acd-d1a431370201-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.391536 4764 generic.go:334] "Generic (PLEG): container finished" podID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerID="01c8782d7264a74c29d540468f39e72803570801ecf6a58f7fdbfc7f99a1f72f" exitCode=143 Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.391622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerDied","Data":"01c8782d7264a74c29d540468f39e72803570801ecf6a58f7fdbfc7f99a1f72f"} Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.394186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" event={"ID":"9a4d6564-31b4-4743-8acd-d1a431370201","Type":"ContainerDied","Data":"3ca06a5c96234e798344ea65027bfc81128e8b93b45fbc477e6e1e02cfdad07f"} Mar 20 
15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.394214 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca06a5c96234e798344ea65027bfc81128e8b93b45fbc477e6e1e02cfdad07f" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.394239 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-k8k8m" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.397455 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerID="2fbd0438563c952d720b877ea2c47f9f2178f878a5051b291b478824ada6dcb5" exitCode=0 Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.397525 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerID="8a291a3d45e37389b19380d0ca98611e483f74521ca1a413679c44e810b1cc77" exitCode=143 Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.397703 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-b6dbs" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.398068 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerDied","Data":"2fbd0438563c952d720b877ea2c47f9f2178f878a5051b291b478824ada6dcb5"} Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.398124 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerDied","Data":"8a291a3d45e37389b19380d0ca98611e483f74521ca1a413679c44e810b1cc77"} Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.461403 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"] Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.467395 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-b6dbs"] Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.492688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:13:57 crc kubenswrapper[4764]: E0320 15:13:57.493035 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4d6564-31b4-4743-8acd-d1a431370201" containerName="nova-cell1-conductor-db-sync" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493045 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4d6564-31b4-4743-8acd-d1a431370201" containerName="nova-cell1-conductor-db-sync" Mar 20 15:13:57 crc kubenswrapper[4764]: E0320 15:13:57.493058 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a21459-b477-4b13-847f-4997f3c4529f" containerName="nova-manage" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493065 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a21459-b477-4b13-847f-4997f3c4529f" containerName="nova-manage" Mar 20 15:13:57 crc 
kubenswrapper[4764]: E0320 15:13:57.493086 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="init" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493093 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="init" Mar 20 15:13:57 crc kubenswrapper[4764]: E0320 15:13:57.493105 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="dnsmasq-dns" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493112 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="dnsmasq-dns" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493275 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4d6564-31b4-4743-8acd-d1a431370201" containerName="nova-cell1-conductor-db-sync" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493291 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a21459-b477-4b13-847f-4997f3c4529f" containerName="nova-manage" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.493303 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" containerName="dnsmasq-dns" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.500145 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.505537 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.520108 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.606523 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.646659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.646952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx9xn\" (UniqueName: \"kubernetes.io/projected/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-kube-api-access-zx9xn\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.647101 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.748089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle\") pod \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.748169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data\") pod \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " Mar 20 15:13:57 crc kubenswrapper[4764]: 
I0320 15:13:57.748266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs\") pod \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.748328 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf27n\" (UniqueName: \"kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n\") pod \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.748356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs\") pod \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\" (UID: \"9d339190-1b2c-43d7-91fb-86ec63bee9c7\") " Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.748804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs" (OuterVolumeSpecName: "logs") pod "9d339190-1b2c-43d7-91fb-86ec63bee9c7" (UID: "9d339190-1b2c-43d7-91fb-86ec63bee9c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.749301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.749493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx9xn\" (UniqueName: \"kubernetes.io/projected/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-kube-api-access-zx9xn\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.749581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.749771 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d339190-1b2c-43d7-91fb-86ec63bee9c7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.753129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.754524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.766321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n" (OuterVolumeSpecName: "kube-api-access-gf27n") pod "9d339190-1b2c-43d7-91fb-86ec63bee9c7" (UID: "9d339190-1b2c-43d7-91fb-86ec63bee9c7"). InnerVolumeSpecName "kube-api-access-gf27n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.772748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx9xn\" (UniqueName: \"kubernetes.io/projected/35ff2d6b-dfa9-41fe-9885-f71d494d6bab-kube-api-access-zx9xn\") pod \"nova-cell1-conductor-0\" (UID: \"35ff2d6b-dfa9-41fe-9885-f71d494d6bab\") " pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.778052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data" (OuterVolumeSpecName: "config-data") pod "9d339190-1b2c-43d7-91fb-86ec63bee9c7" (UID: "9d339190-1b2c-43d7-91fb-86ec63bee9c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.786822 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d339190-1b2c-43d7-91fb-86ec63bee9c7" (UID: "9d339190-1b2c-43d7-91fb-86ec63bee9c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.802580 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9d339190-1b2c-43d7-91fb-86ec63bee9c7" (UID: "9d339190-1b2c-43d7-91fb-86ec63bee9c7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.846086 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.851281 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.851302 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.851316 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf27n\" (UniqueName: \"kubernetes.io/projected/9d339190-1b2c-43d7-91fb-86ec63bee9c7-kube-api-access-gf27n\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:57 crc kubenswrapper[4764]: I0320 15:13:57.851326 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d339190-1b2c-43d7-91fb-86ec63bee9c7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:58 crc kubenswrapper[4764]: W0320 15:13:58.269064 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35ff2d6b_dfa9_41fe_9885_f71d494d6bab.slice/crio-23f93cbc990c8f1d57c77dffa61a38bc2b188c0ddffe107c892353fe33c6081b WatchSource:0}: Error finding container 23f93cbc990c8f1d57c77dffa61a38bc2b188c0ddffe107c892353fe33c6081b: Status 404 returned error can't find the container with id 23f93cbc990c8f1d57c77dffa61a38bc2b188c0ddffe107c892353fe33c6081b Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.275034 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.410237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d339190-1b2c-43d7-91fb-86ec63bee9c7","Type":"ContainerDied","Data":"5bbb4e1ade9041b7bd962b7670f5004dac66f55d134293a646808f24a0000e84"} Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.410284 4764 scope.go:117] "RemoveContainer" containerID="2fbd0438563c952d720b877ea2c47f9f2178f878a5051b291b478824ada6dcb5" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.410408 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.415408 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" containerName="nova-scheduler-scheduler" containerID="cri-o://2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" gracePeriod=30 Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.415529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35ff2d6b-dfa9-41fe-9885-f71d494d6bab","Type":"ContainerStarted","Data":"23f93cbc990c8f1d57c77dffa61a38bc2b188c0ddffe107c892353fe33c6081b"} Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.432864 4764 scope.go:117] "RemoveContainer" containerID="8a291a3d45e37389b19380d0ca98611e483f74521ca1a413679c44e810b1cc77" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.469016 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.479424 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.486217 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: E0320 15:13:58.486629 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-metadata" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.486645 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-metadata" Mar 20 15:13:58 crc kubenswrapper[4764]: E0320 15:13:58.486664 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-log" Mar 20 15:13:58 crc 
kubenswrapper[4764]: I0320 15:13:58.486670 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-log" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.486842 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-log" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.486855 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" containerName="nova-metadata-metadata" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.487738 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.490183 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.490802 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.502106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.664213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.664644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " 
pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.664689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh69w\" (UniqueName: \"kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.664723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.665061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.767207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.767283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.767336 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.767362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh69w\" (UniqueName: \"kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.767402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.768399 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.773954 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.774521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.780007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.787963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh69w\" (UniqueName: \"kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w\") pod \"nova-metadata-0\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " pod="openstack/nova-metadata-0" Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.801742 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.801986 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a9084836-71c4-46c4-9cec-f2f2a5489914" containerName="kube-state-metrics" containerID="cri-o://5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8" gracePeriod=30 Mar 20 15:13:58 crc kubenswrapper[4764]: I0320 15:13:58.809566 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.162104 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83217f94-75f8-4f9b-b9d1-4247c602cc26" path="/var/lib/kubelet/pods/83217f94-75f8-4f9b-b9d1-4247c602cc26/volumes" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.171432 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d339190-1b2c-43d7-91fb-86ec63bee9c7" path="/var/lib/kubelet/pods/9d339190-1b2c-43d7-91fb-86ec63bee9c7/volumes" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.313585 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.327437 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.425041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35ff2d6b-dfa9-41fe-9885-f71d494d6bab","Type":"ContainerStarted","Data":"77b7e956740cb4fbb70ae5d713f7cb4d4671d15ea45f569b5a6f97090709b6f4"} Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.425993 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.435262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerStarted","Data":"9974eff12c88d8d14dd42a0a8a97cb85375b07d7e145583349bd5f934cbe3a98"} Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.439899 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9084836-71c4-46c4-9cec-f2f2a5489914" containerID="5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8" exitCode=2 Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.439940 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9084836-71c4-46c4-9cec-f2f2a5489914","Type":"ContainerDied","Data":"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8"} Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.439963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a9084836-71c4-46c4-9cec-f2f2a5489914","Type":"ContainerDied","Data":"cbad95d598646486a4852eb0ce8d4b3e6a5193c6bf5b2be7910a6bf426ce26cb"} Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.439978 4764 scope.go:117] "RemoveContainer" containerID="5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.440065 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.447165 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.447148391 podStartE2EDuration="2.447148391s" podCreationTimestamp="2026-03-20 15:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:13:59.439426156 +0000 UTC m=+1361.055615285" watchObservedRunningTime="2026-03-20 15:13:59.447148391 +0000 UTC m=+1361.063337520" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.469239 4764 scope.go:117] "RemoveContainer" containerID="5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8" Mar 20 15:13:59 crc kubenswrapper[4764]: E0320 15:13:59.469746 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8\": container with ID starting with 5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8 
not found: ID does not exist" containerID="5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.469793 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8"} err="failed to get container status \"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8\": rpc error: code = NotFound desc = could not find container \"5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8\": container with ID starting with 5a019c240e5998464d077009d581a618dd52241dc5e6d2d4efa0134f7b15f8d8 not found: ID does not exist" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.487583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w7z4\" (UniqueName: \"kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4\") pod \"a9084836-71c4-46c4-9cec-f2f2a5489914\" (UID: \"a9084836-71c4-46c4-9cec-f2f2a5489914\") " Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.495547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4" (OuterVolumeSpecName: "kube-api-access-9w7z4") pod "a9084836-71c4-46c4-9cec-f2f2a5489914" (UID: "a9084836-71c4-46c4-9cec-f2f2a5489914"). InnerVolumeSpecName "kube-api-access-9w7z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.589846 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w7z4\" (UniqueName: \"kubernetes.io/projected/a9084836-71c4-46c4-9cec-f2f2a5489914-kube-api-access-9w7z4\") on node \"crc\" DevicePath \"\"" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.804449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.812153 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.827600 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:13:59 crc kubenswrapper[4764]: E0320 15:13:59.827941 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9084836-71c4-46c4-9cec-f2f2a5489914" containerName="kube-state-metrics" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.827955 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9084836-71c4-46c4-9cec-f2f2a5489914" containerName="kube-state-metrics" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.828124 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9084836-71c4-46c4-9cec-f2f2a5489914" containerName="kube-state-metrics" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.829043 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.835741 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.836046 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.865396 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.996508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.996710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.996760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:13:59 crc kubenswrapper[4764]: I0320 15:13:59.996792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw7pp\" (UniqueName: 
\"kubernetes.io/projected/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-api-access-kw7pp\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.098092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.098175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.098212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw7pp\" (UniqueName: \"kubernetes.io/projected/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-api-access-kw7pp\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.098262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.102827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.106889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.118075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw7pp\" (UniqueName: \"kubernetes.io/projected/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-api-access-kw7pp\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.118127 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f33a933-0b09-4945-8d48-449ea849f7e1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5f33a933-0b09-4945-8d48-449ea849f7e1\") " pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.138013 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566994-9t58k"] Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.139158 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.149343 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566994-9t58k"] Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.151000 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.151641 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.151773 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.200416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.305794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-789gd\" (UniqueName: \"kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd\") pod \"auto-csr-approver-29566994-9t58k\" (UID: \"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea\") " pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.407354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-789gd\" (UniqueName: \"kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd\") pod \"auto-csr-approver-29566994-9t58k\" (UID: \"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea\") " pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.425986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-789gd\" (UniqueName: 
\"kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd\") pod \"auto-csr-approver-29566994-9t58k\" (UID: \"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea\") " pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.452273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerStarted","Data":"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e"} Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.452314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerStarted","Data":"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7"} Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.475161 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.475145266 podStartE2EDuration="2.475145266s" podCreationTimestamp="2026-03-20 15:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:00.469012939 +0000 UTC m=+1362.085202068" watchObservedRunningTime="2026-03-20 15:14:00.475145266 +0000 UTC m=+1362.091334385" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.655367 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:00 crc kubenswrapper[4764]: I0320 15:14:00.706760 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 15:14:00 crc kubenswrapper[4764]: E0320 15:14:00.822886 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:00 crc kubenswrapper[4764]: E0320 15:14:00.824641 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:00 crc kubenswrapper[4764]: E0320 15:14:00.825819 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:00 crc kubenswrapper[4764]: E0320 15:14:00.825862 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" containerName="nova-scheduler-scheduler" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.002448 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 
15:14:01.002741 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-central-agent" containerID="cri-o://c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d" gracePeriod=30 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.003130 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="proxy-httpd" containerID="cri-o://4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389" gracePeriod=30 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.003172 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="sg-core" containerID="cri-o://854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717" gracePeriod=30 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.003197 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-notification-agent" containerID="cri-o://bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477" gracePeriod=30 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.140566 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9084836-71c4-46c4-9cec-f2f2a5489914" path="/var/lib/kubelet/pods/a9084836-71c4-46c4-9cec-f2f2a5489914/volumes" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.160928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566994-9t58k"] Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.245526 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.444182 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9jnc\" (UniqueName: \"kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc\") pod \"570012fa-4a72-4a1e-905d-3e46c21da637\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.444254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle\") pod \"570012fa-4a72-4a1e-905d-3e46c21da637\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.444535 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data\") pod \"570012fa-4a72-4a1e-905d-3e46c21da637\" (UID: \"570012fa-4a72-4a1e-905d-3e46c21da637\") " Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.449454 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc" (OuterVolumeSpecName: "kube-api-access-p9jnc") pod "570012fa-4a72-4a1e-905d-3e46c21da637" (UID: "570012fa-4a72-4a1e-905d-3e46c21da637"). InnerVolumeSpecName "kube-api-access-p9jnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.469800 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "570012fa-4a72-4a1e-905d-3e46c21da637" (UID: "570012fa-4a72-4a1e-905d-3e46c21da637"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.470731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5f33a933-0b09-4945-8d48-449ea849f7e1","Type":"ContainerStarted","Data":"33f3bf3c5063a127cbdc2be3d58609728320ebf5fa94af37d946431442e2c0dc"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.470808 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5f33a933-0b09-4945-8d48-449ea849f7e1","Type":"ContainerStarted","Data":"bed3ba12b72a93efbde1e7874c1c88cb0e5a38178c9e55857f4f77f1ca6842ac"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.470895 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473498 4764 generic.go:334] "Generic (PLEG): container finished" podID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerID="4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389" exitCode=0 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473531 4764 generic.go:334] "Generic (PLEG): container finished" podID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerID="854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717" exitCode=2 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473566 4764 generic.go:334] "Generic (PLEG): container finished" podID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerID="c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d" exitCode=0 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerDied","Data":"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473667 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerDied","Data":"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.473681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerDied","Data":"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.475829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566994-9t58k" event={"ID":"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea","Type":"ContainerStarted","Data":"d4b5cf223168178f167fe60d552f456eaffe7701a55b1e88750143fdbd9b4f0b"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.476206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data" (OuterVolumeSpecName: "config-data") pod "570012fa-4a72-4a1e-905d-3e46c21da637" (UID: "570012fa-4a72-4a1e-905d-3e46c21da637"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.477932 4764 generic.go:334] "Generic (PLEG): container finished" podID="570012fa-4a72-4a1e-905d-3e46c21da637" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" exitCode=0 Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.477997 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.477980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"570012fa-4a72-4a1e-905d-3e46c21da637","Type":"ContainerDied","Data":"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.478069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"570012fa-4a72-4a1e-905d-3e46c21da637","Type":"ContainerDied","Data":"8cdba5767c08d3e4f7bc938ce460e87366186f92942495e08bd23e9763463970"} Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.478099 4764 scope.go:117] "RemoveContainer" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.488758 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.071177389 podStartE2EDuration="2.488745422s" podCreationTimestamp="2026-03-20 15:13:59 +0000 UTC" firstStartedPulling="2026-03-20 15:14:00.746942606 +0000 UTC m=+1362.363131735" lastFinishedPulling="2026-03-20 15:14:01.164510639 +0000 UTC m=+1362.780699768" observedRunningTime="2026-03-20 15:14:01.48802007 +0000 UTC m=+1363.104209209" watchObservedRunningTime="2026-03-20 15:14:01.488745422 +0000 UTC m=+1363.104934551" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.499322 4764 scope.go:117] "RemoveContainer" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" Mar 20 15:14:01 crc kubenswrapper[4764]: E0320 15:14:01.499879 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea\": container with ID starting with 2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea not found: 
ID does not exist" containerID="2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.499935 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea"} err="failed to get container status \"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea\": rpc error: code = NotFound desc = could not find container \"2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea\": container with ID starting with 2b25a36b5a5a634477122fc095f289f4d3ff6d0ac0e945f479a97b96314fc9ea not found: ID does not exist" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.517932 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.531756 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.549728 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.549966 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9jnc\" (UniqueName: \"kubernetes.io/projected/570012fa-4a72-4a1e-905d-3e46c21da637-kube-api-access-p9jnc\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.550039 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570012fa-4a72-4a1e-905d-3e46c21da637-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.558073 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:01 crc kubenswrapper[4764]: 
E0320 15:14:01.558616 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" containerName="nova-scheduler-scheduler" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.558692 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" containerName="nova-scheduler-scheduler" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.558934 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" containerName="nova-scheduler-scheduler" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.559517 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.562344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.571942 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:01 crc kubenswrapper[4764]: E0320 15:14:01.733559 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570012fa_4a72_4a1e_905d_3e46c21da637.slice\": RecentStats: unable to find data in memory cache]" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.756908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94xg\" (UniqueName: \"kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.756981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.758298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.860904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j94xg\" (UniqueName: \"kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.861082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.861281 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.866124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.876780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:01 crc kubenswrapper[4764]: I0320 15:14:01.887824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94xg\" (UniqueName: \"kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg\") pod \"nova-scheduler-0\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:02 crc kubenswrapper[4764]: I0320 15:14:02.177857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:02 crc kubenswrapper[4764]: I0320 15:14:02.367931 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470106 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470502 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qblzc\" (UniqueName: \"kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470742 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.470786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data\") pod \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\" (UID: \"b01f9f5c-0c39-4c1c-9052-21c421fd55bd\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.471451 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.472109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.472112 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.476264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc" (OuterVolumeSpecName: "kube-api-access-qblzc") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). 
InnerVolumeSpecName "kube-api-access-qblzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.481366 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts" (OuterVolumeSpecName: "scripts") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.496198 4764 generic.go:334] "Generic (PLEG): container finished" podID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerID="682152bd8dec83672f26bef33893f919ebd46ba3f0609609bf4964b314a37c7b" exitCode=0 Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.496276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerDied","Data":"682152bd8dec83672f26bef33893f919ebd46ba3f0609609bf4964b314a37c7b"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.498543 4764 generic.go:334] "Generic (PLEG): container finished" podID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerID="bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477" exitCode=0 Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.499477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.500012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerDied","Data":"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.500034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b01f9f5c-0c39-4c1c-9052-21c421fd55bd","Type":"ContainerDied","Data":"f4497b611d0052ddefba78208add4b9d3dd101b374af3825351b3d2e7b8679f4"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.500053 4764 scope.go:117] "RemoveContainer" containerID="4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.503036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.524456 4764 scope.go:117] "RemoveContainer" containerID="854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.534671 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.543554 4764 scope.go:117] "RemoveContainer" containerID="bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.570460 4764 scope.go:117] "RemoveContainer" containerID="c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.573589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle\") pod \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.573836 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data\") pod \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.573884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x9sp\" (UniqueName: \"kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp\") pod \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.573910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs\") pod \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\" (UID: \"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d\") " Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.574344 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.574356 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.574364 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qblzc\" (UniqueName: \"kubernetes.io/projected/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-kube-api-access-qblzc\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.574374 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.574577 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs" (OuterVolumeSpecName: "logs") pod "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" (UID: "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.578188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp" (OuterVolumeSpecName: "kube-api-access-4x9sp") pod "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" (UID: "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d"). InnerVolumeSpecName "kube-api-access-4x9sp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.589103 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.592918 4764 scope.go:117] "RemoveContainer" containerID="4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:02.593272 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389\": container with ID starting with 4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389 not found: ID does not exist" containerID="4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.593307 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389"} err="failed to get container status \"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389\": rpc error: code = NotFound desc = could not find container \"4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389\": container with ID starting with 4c118ae57366aa44a0ea81d0575d897683ac5e16e38b9de721bcabc322741389 not found: ID does not exist" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.593333 4764 scope.go:117] "RemoveContainer" containerID="854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:02.593662 4764 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717\": container with ID starting with 854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717 not found: ID does not exist" containerID="854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.593697 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717"} err="failed to get container status \"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717\": rpc error: code = NotFound desc = could not find container \"854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717\": container with ID starting with 854932a62844c1d53c42ecc0c87f6734f70c034f2ca196b9d1653db1ccb5e717 not found: ID does not exist" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.593719 4764 scope.go:117] "RemoveContainer" containerID="bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:02.594019 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477\": container with ID starting with bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477 not found: ID does not exist" containerID="bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.594046 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477"} err="failed to get container status \"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477\": rpc error: code = NotFound desc = could 
not find container \"bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477\": container with ID starting with bcc69cd16dd8f6fbcf113500660824f1f32cb0f3bbc550e7335b2c73c02b9477 not found: ID does not exist" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.594064 4764 scope.go:117] "RemoveContainer" containerID="c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:02.594236 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d\": container with ID starting with c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d not found: ID does not exist" containerID="c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.594270 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d"} err="failed to get container status \"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d\": rpc error: code = NotFound desc = could not find container \"c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d\": container with ID starting with c9537695917c18d346e953254ab65e598668de774f176bcd24a6967b2f95cb6d not found: ID does not exist" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.596027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data" (OuterVolumeSpecName: "config-data") pod "b01f9f5c-0c39-4c1c-9052-21c421fd55bd" (UID: "b01f9f5c-0c39-4c1c-9052-21c421fd55bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.598120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data" (OuterVolumeSpecName: "config-data") pod "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" (UID: "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.602715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" (UID: "aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674888 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674915 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b01f9f5c-0c39-4c1c-9052-21c421fd55bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674926 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674935 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x9sp\" (UniqueName: \"kubernetes.io/projected/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-kube-api-access-4x9sp\") on node \"crc\" DevicePath 
\"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674944 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.674955 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.705504 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:02.998877 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.011934 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018213 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-notification-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018811 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-notification-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018843 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="sg-core" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018851 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="sg-core" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018869 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-log" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018877 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-log" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018889 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="proxy-httpd" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018896 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="proxy-httpd" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018918 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-central-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018925 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-central-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: E0320 15:14:03.018937 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-api" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.018943 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-api" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019178 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="sg-core" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019199 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="proxy-httpd" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019218 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-central-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019232 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-log" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019248 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" containerName="ceilometer-notification-agent" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.019259 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" containerName="nova-api-api" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.021447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.024489 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.024680 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.024970 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.040706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085470 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 
15:14:03.085543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2jt\" (UniqueName: \"kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.085796 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.137065 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="570012fa-4a72-4a1e-905d-3e46c21da637" path="/var/lib/kubelet/pods/570012fa-4a72-4a1e-905d-3e46c21da637/volumes" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.137799 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01f9f5c-0c39-4c1c-9052-21c421fd55bd" path="/var/lib/kubelet/pods/b01f9f5c-0c39-4c1c-9052-21c421fd55bd/volumes" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.200995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.201310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.201813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.201916 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.201977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2jt\" (UniqueName: \"kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.208217 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.208951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.223077 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.223961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " 
pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.230438 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.230591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2jt\" (UniqueName: \"kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt\") pod \"ceilometer-0\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.345921 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.519921 4764 generic.go:334] "Generic (PLEG): container finished" podID="0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" containerID="12add6006235e427d2ecedb6615911851169993dbc5e086bcd33a587843c6ce5" exitCode=0 Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.519984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566994-9t58k" event={"ID":"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea","Type":"ContainerDied","Data":"12add6006235e427d2ecedb6615911851169993dbc5e086bcd33a587843c6ce5"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.525402 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d","Type":"ContainerDied","Data":"a498f1d783b096f5242c4e24a3d880a5018b96fa69cb776d96ef70bee7277158"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.525450 4764 scope.go:117] "RemoveContainer" containerID="682152bd8dec83672f26bef33893f919ebd46ba3f0609609bf4964b314a37c7b" Mar 20 15:14:03 crc 
kubenswrapper[4764]: I0320 15:14:03.525557 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.537109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d1ead43-f1f4-41ba-8de0-bf1c1f386272","Type":"ContainerStarted","Data":"8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.537156 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d1ead43-f1f4-41ba-8de0-bf1c1f386272","Type":"ContainerStarted","Data":"cf46cbe118f6350a6c72d18a1e3b11b76605980944ea2e8ad150395dfcf242c0"} Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.565651 4764 scope.go:117] "RemoveContainer" containerID="01c8782d7264a74c29d540468f39e72803570801ecf6a58f7fdbfc7f99a1f72f" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.569836 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.569816168 podStartE2EDuration="2.569816168s" podCreationTimestamp="2026-03-20 15:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:03.564820605 +0000 UTC m=+1365.181009734" watchObservedRunningTime="2026-03-20 15:14:03.569816168 +0000 UTC m=+1365.186005287" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.608505 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.624889 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.638779 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: W0320 15:14:03.640604 4764 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bb4537e_8338_464e_9770_ec2afac2e0c9.slice/crio-0d8fa00b97a719f611f7d41161befe65fd77ee6dcdd28f62a28eee5f6d5bbbc6 WatchSource:0}: Error finding container 0d8fa00b97a719f611f7d41161befe65fd77ee6dcdd28f62a28eee5f6d5bbbc6: Status 404 returned error can't find the container with id 0d8fa00b97a719f611f7d41161befe65fd77ee6dcdd28f62a28eee5f6d5bbbc6 Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.641456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.644000 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.650549 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.663367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.710999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.711047 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv5r\" (UniqueName: \"kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.711071 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.711132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.813276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.813401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.813456 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv5r\" (UniqueName: \"kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.813482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " 
pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.814995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.820791 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.827988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.830563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv5r\" (UniqueName: \"kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r\") pod \"nova-api-0\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " pod="openstack/nova-api-0" Mar 20 15:14:03 crc kubenswrapper[4764]: I0320 15:14:03.982784 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:04 crc kubenswrapper[4764]: I0320 15:14:04.473064 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:04 crc kubenswrapper[4764]: W0320 15:14:04.473497 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb122ca_f6cc_46fb_a2f8_f866f4c311da.slice/crio-b8b82edc4016077ae623e5bfa1723518624833c627a2af4d23edf14db44af472 WatchSource:0}: Error finding container b8b82edc4016077ae623e5bfa1723518624833c627a2af4d23edf14db44af472: Status 404 returned error can't find the container with id b8b82edc4016077ae623e5bfa1723518624833c627a2af4d23edf14db44af472 Mar 20 15:14:04 crc kubenswrapper[4764]: I0320 15:14:04.550544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerStarted","Data":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"} Mar 20 15:14:04 crc kubenswrapper[4764]: I0320 15:14:04.550580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerStarted","Data":"0d8fa00b97a719f611f7d41161befe65fd77ee6dcdd28f62a28eee5f6d5bbbc6"} Mar 20 15:14:04 crc kubenswrapper[4764]: I0320 15:14:04.551642 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerStarted","Data":"b8b82edc4016077ae623e5bfa1723518624833c627a2af4d23edf14db44af472"} Mar 20 15:14:04 crc kubenswrapper[4764]: I0320 15:14:04.931744 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.038948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-789gd\" (UniqueName: \"kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd\") pod \"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea\" (UID: \"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea\") " Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.042996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd" (OuterVolumeSpecName: "kube-api-access-789gd") pod "0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" (UID: "0b25e8b0-ce3b-48ff-81df-4589b0ec17ea"). InnerVolumeSpecName "kube-api-access-789gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.137287 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d" path="/var/lib/kubelet/pods/aa12ad80-b4a5-4af1-bbe6-1e28e97acb2d/volumes" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.141032 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-789gd\" (UniqueName: \"kubernetes.io/projected/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea-kube-api-access-789gd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.562842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerStarted","Data":"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"} Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.562915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerStarted","Data":"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"} Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.567621 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerStarted","Data":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"} Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.573195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566994-9t58k" event={"ID":"0b25e8b0-ce3b-48ff-81df-4589b0ec17ea","Type":"ContainerDied","Data":"d4b5cf223168178f167fe60d552f456eaffe7701a55b1e88750143fdbd9b4f0b"} Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.573250 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b5cf223168178f167fe60d552f456eaffe7701a55b1e88750143fdbd9b4f0b" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.573255 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566994-9t58k" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.588014 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.587998403 podStartE2EDuration="2.587998403s" podCreationTimestamp="2026-03-20 15:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:05.582720332 +0000 UTC m=+1367.198909461" watchObservedRunningTime="2026-03-20 15:14:05.587998403 +0000 UTC m=+1367.204187532" Mar 20 15:14:05 crc kubenswrapper[4764]: I0320 15:14:05.998569 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566988-bxvzm"] Mar 20 15:14:06 crc kubenswrapper[4764]: I0320 15:14:06.006043 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566988-bxvzm"] Mar 20 15:14:06 crc kubenswrapper[4764]: I0320 15:14:06.586213 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerStarted","Data":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"} Mar 20 15:14:07 crc kubenswrapper[4764]: I0320 15:14:07.159279 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c7ef967-1634-40b5-a3eb-972372a02741" path="/var/lib/kubelet/pods/3c7ef967-1634-40b5-a3eb-972372a02741/volumes" Mar 20 15:14:07 crc kubenswrapper[4764]: I0320 15:14:07.178245 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:14:07 crc kubenswrapper[4764]: I0320 15:14:07.894079 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.448266 4764 patch_prober.go:28] interesting 
pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.448869 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.628869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerStarted","Data":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"} Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.630887 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.669247 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.969779333 podStartE2EDuration="6.669224735s" podCreationTimestamp="2026-03-20 15:14:02 +0000 UTC" firstStartedPulling="2026-03-20 15:14:03.643532429 +0000 UTC m=+1365.259721558" lastFinishedPulling="2026-03-20 15:14:08.342977811 +0000 UTC m=+1369.959166960" observedRunningTime="2026-03-20 15:14:08.650629876 +0000 UTC m=+1370.266819045" watchObservedRunningTime="2026-03-20 15:14:08.669224735 +0000 UTC m=+1370.285413874" Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.810552 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:14:08 crc kubenswrapper[4764]: I0320 15:14:08.810670 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:14:09 crc kubenswrapper[4764]: I0320 15:14:09.830658 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:09 crc kubenswrapper[4764]: I0320 15:14:09.830656 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:10 crc kubenswrapper[4764]: I0320 15:14:10.229145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 15:14:12 crc kubenswrapper[4764]: I0320 15:14:12.178608 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 15:14:12 crc kubenswrapper[4764]: I0320 15:14:12.222599 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 15:14:12 crc kubenswrapper[4764]: I0320 15:14:12.718100 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 15:14:13 crc kubenswrapper[4764]: I0320 15:14:13.984250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:14:13 crc kubenswrapper[4764]: I0320 15:14:13.984737 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:14:15 crc kubenswrapper[4764]: I0320 15:14:15.025667 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:15 crc kubenswrapper[4764]: I0320 15:14:15.066715 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:16 crc kubenswrapper[4764]: I0320 15:14:16.810398 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:14:16 crc kubenswrapper[4764]: I0320 15:14:16.810468 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:14:18 crc kubenswrapper[4764]: I0320 15:14:18.816685 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:14:18 crc kubenswrapper[4764]: I0320 15:14:18.821799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:14:18 crc kubenswrapper[4764]: I0320 15:14:18.824515 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:14:19 crc kubenswrapper[4764]: I0320 15:14:19.749895 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.694245 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.755666 4764 generic.go:334] "Generic (PLEG): container finished" podID="a17efe33-2090-4201-be2e-0add4be515a7" containerID="ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2" exitCode=137 Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.755729 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.755780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a17efe33-2090-4201-be2e-0add4be515a7","Type":"ContainerDied","Data":"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2"} Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.755845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a17efe33-2090-4201-be2e-0add4be515a7","Type":"ContainerDied","Data":"16ebc0abd673a6d450c4f8b6c149d9d93b769abcf48df3feefcf1545c23a67dc"} Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.755876 4764 scope.go:117] "RemoveContainer" containerID="ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.786588 4764 scope.go:117] "RemoveContainer" containerID="ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2" Mar 20 15:14:20 crc kubenswrapper[4764]: E0320 15:14:20.787780 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2\": container with ID starting with ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2 not found: ID does not exist" containerID="ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 
15:14:20.787817 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2"} err="failed to get container status \"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2\": rpc error: code = NotFound desc = could not find container \"ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2\": container with ID starting with ecfc81a59c478939b2f070883b7a713f46588f805e4f45270008fa2eea04a6f2 not found: ID does not exist" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.854450 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czm2g\" (UniqueName: \"kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g\") pod \"a17efe33-2090-4201-be2e-0add4be515a7\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.854559 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle\") pod \"a17efe33-2090-4201-be2e-0add4be515a7\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.854684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data\") pod \"a17efe33-2090-4201-be2e-0add4be515a7\" (UID: \"a17efe33-2090-4201-be2e-0add4be515a7\") " Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.867848 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g" (OuterVolumeSpecName: "kube-api-access-czm2g") pod "a17efe33-2090-4201-be2e-0add4be515a7" (UID: "a17efe33-2090-4201-be2e-0add4be515a7"). 
InnerVolumeSpecName "kube-api-access-czm2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.889879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a17efe33-2090-4201-be2e-0add4be515a7" (UID: "a17efe33-2090-4201-be2e-0add4be515a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.904001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data" (OuterVolumeSpecName: "config-data") pod "a17efe33-2090-4201-be2e-0add4be515a7" (UID: "a17efe33-2090-4201-be2e-0add4be515a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.956276 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czm2g\" (UniqueName: \"kubernetes.io/projected/a17efe33-2090-4201-be2e-0add4be515a7-kube-api-access-czm2g\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.956302 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:20 crc kubenswrapper[4764]: I0320 15:14:20.956311 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17efe33-2090-4201-be2e-0add4be515a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.097566 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 
15:14:21.115601 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.137960 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a17efe33-2090-4201-be2e-0add4be515a7" path="/var/lib/kubelet/pods/a17efe33-2090-4201-be2e-0add4be515a7/volumes" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.138947 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 15:14:21 crc kubenswrapper[4764]: E0320 15:14:21.139463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a17efe33-2090-4201-be2e-0add4be515a7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.139491 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a17efe33-2090-4201-be2e-0add4be515a7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 15:14:21 crc kubenswrapper[4764]: E0320 15:14:21.139544 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" containerName="oc" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.139558 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" containerName="oc" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.139867 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a17efe33-2090-4201-be2e-0add4be515a7" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.139906 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" containerName="oc" Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.142047 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.148468 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.148701 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.148890 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.152859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.262904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.263881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzmk\" (UniqueName: \"kubernetes.io/projected/af7cb7f6-9466-48d0-b017-12be46d4f2c6-kube-api-access-nqzmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.263929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.264136 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.264335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.365949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.366124 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqzmk\" (UniqueName: \"kubernetes.io/projected/af7cb7f6-9466-48d0-b017-12be46d4f2c6-kube-api-access-nqzmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.366179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.366252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.366359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.370525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.371111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.371646 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.374014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/af7cb7f6-9466-48d0-b017-12be46d4f2c6-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.386612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqzmk\" (UniqueName: \"kubernetes.io/projected/af7cb7f6-9466-48d0-b017-12be46d4f2c6-kube-api-access-nqzmk\") pod \"nova-cell1-novncproxy-0\" (UID: \"af7cb7f6-9466-48d0-b017-12be46d4f2c6\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.473612 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.956430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.983650 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 15:14:21 crc kubenswrapper[4764]: I0320 15:14:21.984097 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 15:14:22 crc kubenswrapper[4764]: I0320 15:14:22.781177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"af7cb7f6-9466-48d0-b017-12be46d4f2c6","Type":"ContainerStarted","Data":"49d5559b0e11d66f478bb6bc8baa23517d9347008d16273099752bbefbabb959"}
Mar 20 15:14:22 crc kubenswrapper[4764]: I0320 15:14:22.781237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"af7cb7f6-9466-48d0-b017-12be46d4f2c6","Type":"ContainerStarted","Data":"a11c5085fefe783417eb0c0d828e1002cd33f96d9a62fb59ab44590062f872d7"}
Mar 20 15:14:22 crc kubenswrapper[4764]: I0320 15:14:22.813611 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8135891549999998 podStartE2EDuration="1.813589155s" podCreationTimestamp="2026-03-20 15:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:22.806474179 +0000 UTC m=+1384.422663328" watchObservedRunningTime="2026-03-20 15:14:22.813589155 +0000 UTC m=+1384.429778294"
Mar 20 15:14:23 crc kubenswrapper[4764]: I0320 15:14:23.987590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 15:14:23 crc kubenswrapper[4764]: I0320 15:14:23.989623 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 15:14:23 crc kubenswrapper[4764]: I0320 15:14:23.990968 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 15:14:24 crc kubenswrapper[4764]: I0320 15:14:24.803356 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 15:14:24 crc kubenswrapper[4764]: I0320 15:14:24.984575 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"]
Mar 20 15:14:24 crc kubenswrapper[4764]: I0320 15:14:24.986069 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.010263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"]
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.159887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v7ht\" (UniqueName: \"kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.159975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.160076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.160131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.160162 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.160192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.262972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v7ht\" (UniqueName: \"kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.263059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.263194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.263225 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.263269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.263289 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.264141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.266432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.266613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.266713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.266790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.283748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v7ht\" (UniqueName: \"kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht\") pod \"dnsmasq-dns-89c5cd4d5-gsbfb\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.324914 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:25 crc kubenswrapper[4764]: I0320 15:14:25.874684 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"]
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.474483 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.818263 4764 generic.go:334] "Generic (PLEG): container finished" podID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerID="8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331" exitCode=0
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.818419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" event={"ID":"53479118-a3ab-481a-b7f5-8ad3cdc1828e","Type":"ContainerDied","Data":"8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331"}
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.818469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" event={"ID":"53479118-a3ab-481a-b7f5-8ad3cdc1828e","Type":"ContainerStarted","Data":"fba15d33437f5d4d3102b9595ee458f8b304ac1360e3c3067da9e79cbded5e38"}
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.821361 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.824925 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-central-agent" containerID="cri-o://c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7" gracePeriod=30
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.825516 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="proxy-httpd" containerID="cri-o://d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85" gracePeriod=30
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.825744 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-notification-agent" containerID="cri-o://d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa" gracePeriod=30
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.825814 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="sg-core" containerID="cri-o://846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e" gracePeriod=30
Mar 20 15:14:26 crc kubenswrapper[4764]: I0320 15:14:26.843186 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.208:3000/\": EOF"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.373259 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.824123 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829368 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85" exitCode=0
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829775 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e" exitCode=2
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829850 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa" exitCode=0
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829484 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerDied","Data":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerDied","Data":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerDied","Data":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.830003 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerDied","Data":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829448 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.829922 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7" exitCode=0
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.830217 4764 scope.go:117] "RemoveContainer" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.830243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bb4537e-8338-464e-9770-ec2afac2e0c9","Type":"ContainerDied","Data":"0d8fa00b97a719f611f7d41161befe65fd77ee6dcdd28f62a28eee5f6d5bbbc6"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.833270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" event={"ID":"53479118-a3ab-481a-b7f5-8ad3cdc1828e","Type":"ContainerStarted","Data":"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a"}
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.833327 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-log" containerID="cri-o://c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093" gracePeriod=30
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.833474 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-api" containerID="cri-o://370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e" gracePeriod=30
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.857488 4764 scope.go:117] "RemoveContainer" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.894116 4764 scope.go:117] "RemoveContainer" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.916215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2jt\" (UniqueName: \"kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") "
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.916300 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") "
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.922067 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt" (OuterVolumeSpecName: "kube-api-access-qh2jt") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "kube-api-access-qh2jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.929286 4764 scope.go:117] "RemoveContainer" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.983926 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.992343 4764 scope.go:117] "RemoveContainer" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: E0320 15:14:27.992777 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": container with ID starting with d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85 not found: ID does not exist" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.992818 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"} err="failed to get container status \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": rpc error: code = NotFound desc = could not find container \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": container with ID starting with d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.992844 4764 scope.go:117] "RemoveContainer" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: E0320 15:14:27.993457 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": container with ID starting with 846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e not found: ID does not exist" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.993492 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"} err="failed to get container status \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": rpc error: code = NotFound desc = could not find container \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": container with ID starting with 846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.993511 4764 scope.go:117] "RemoveContainer" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: E0320 15:14:27.993801 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": container with ID starting with d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa not found: ID does not exist" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.993841 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"} err="failed to get container status \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": rpc error: code = NotFound desc = could not find container \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": container with ID starting with d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.993869 4764 scope.go:117] "RemoveContainer" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"
Mar 20 15:14:27 crc kubenswrapper[4764]: E0320 15:14:27.994361 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": container with ID starting with c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7 not found: ID does not exist" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.994405 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"} err="failed to get container status \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": rpc error: code = NotFound desc = could not find container \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": container with ID starting with c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.994427 4764 scope.go:117] "RemoveContainer" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.994883 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"} err="failed to get container status \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": rpc error: code = NotFound desc = could not find container \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": container with ID starting with d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.994904 4764 scope.go:117] "RemoveContainer" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.995663 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"} err="failed to get container status \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": rpc error: code = NotFound desc = could not find container \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": container with ID starting with 846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.995683 4764 scope.go:117] "RemoveContainer" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996007 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"} err="failed to get container status \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": rpc error: code = NotFound desc = could not find container \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": container with ID starting with d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996053 4764 scope.go:117] "RemoveContainer" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996497 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"} err="failed to get container status \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": rpc error: code = NotFound desc = could not find container \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": container with ID starting with c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996562 4764 scope.go:117] "RemoveContainer" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996830 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"} err="failed to get container status \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": rpc error: code = NotFound desc = could not find container \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": container with ID starting with d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.996852 4764 scope.go:117] "RemoveContainer" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997196 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"} err="failed to get container status \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": rpc error: code = NotFound desc = could not find container \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": container with ID starting with 846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997223 4764 scope.go:117] "RemoveContainer" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997451 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"} err="failed to get container status \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": rpc error: code = NotFound desc = could not find container \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": container with ID starting with d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997595 4764 scope.go:117] "RemoveContainer" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997839 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"} err="failed to get container status \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": rpc error: code = NotFound desc = could not find container \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": container with ID starting with c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.997860 4764 scope.go:117] "RemoveContainer" containerID="d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.998700 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85"} err="failed to get container status \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": rpc error: code = NotFound desc = could not find container \"d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85\": container with ID starting with d9d80e511acc489ced3d948521db12d83f365d861763fa4917bdd1e80f74ec85 not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.998748 4764 scope.go:117] "RemoveContainer" containerID="846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.999127 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e"} err="failed to get container status \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": rpc error: code = NotFound desc = could not find container \"846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e\": container with ID starting with 846ba3be7062b25d202cb1a38c7bf0b540fe08d911c396746172f0b68c663c2e not found: ID does not exist"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.999152 4764 scope.go:117] "RemoveContainer" containerID="d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"
Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.999445 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa"} err="failed to get container status \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": rpc error: code = NotFound desc = could
not find container \"d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa\": container with ID starting with d0121ca3dfb588941338e2df21f7db0d9655a29bb4b67e1357e3d0fc72e575fa not found: ID does not exist" Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.999468 4764 scope.go:117] "RemoveContainer" containerID="c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7" Mar 20 15:14:27 crc kubenswrapper[4764]: I0320 15:14:27.999675 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7"} err="failed to get container status \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": rpc error: code = NotFound desc = could not find container \"c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7\": container with ID starting with c33200b3e4e7ea22388d9e49b742f00fca1ce2d030824bc77f09cbcebecdcce7 not found: ID does not exist" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.020611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.020752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.020909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: 
\"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.021033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.021145 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.021255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd\") pod \"2bb4537e-8338-464e-9770-ec2afac2e0c9\" (UID: \"2bb4537e-8338-464e-9770-ec2afac2e0c9\") " Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.021152 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.021647 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.022014 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh2jt\" (UniqueName: \"kubernetes.io/projected/2bb4537e-8338-464e-9770-ec2afac2e0c9-kube-api-access-qh2jt\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.022092 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.022160 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.022237 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bb4537e-8338-464e-9770-ec2afac2e0c9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.023825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts" (OuterVolumeSpecName: "scripts") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.045317 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.112121 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.123881 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.123906 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.123918 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.147565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data" (OuterVolumeSpecName: "config-data") pod "2bb4537e-8338-464e-9770-ec2afac2e0c9" (UID: "2bb4537e-8338-464e-9770-ec2afac2e0c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.225692 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb4537e-8338-464e-9770-ec2afac2e0c9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.463982 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" podStartSLOduration=4.463966039 podStartE2EDuration="4.463966039s" podCreationTimestamp="2026-03-20 15:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:27.883589044 +0000 UTC m=+1389.499778163" watchObservedRunningTime="2026-03-20 15:14:28.463966039 +0000 UTC m=+1390.080155168" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.471579 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.514724 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533042 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:28 crc kubenswrapper[4764]: E0320 15:14:28.533492 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="proxy-httpd" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533513 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="proxy-httpd" Mar 20 15:14:28 crc kubenswrapper[4764]: E0320 15:14:28.533526 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-central-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533533 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-central-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: E0320 15:14:28.533546 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-notification-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533552 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-notification-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: E0320 15:14:28.533564 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="sg-core" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533570 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="sg-core" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533746 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-central-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533770 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="ceilometer-notification-agent" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533780 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="sg-core" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.533789 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" containerName="proxy-httpd" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.535606 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.538323 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.538468 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.538795 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.544651 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631897 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.631987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.632015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.632079 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw\") pod 
\"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.734924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.735453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.740439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.740561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 
15:14:28.741146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.741879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.748994 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.752854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw\") pod \"ceilometer-0\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") " pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.765518 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.766417 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.847000 4764 generic.go:334] "Generic (PLEG): container finished" podID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerID="c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093" exitCode=143 Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.847074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerDied","Data":"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"} Mar 20 15:14:28 crc kubenswrapper[4764]: I0320 15:14:28.847269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" Mar 20 15:14:29 crc kubenswrapper[4764]: I0320 15:14:29.142944 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb4537e-8338-464e-9770-ec2afac2e0c9" path="/var/lib/kubelet/pods/2bb4537e-8338-464e-9770-ec2afac2e0c9/volumes" Mar 20 15:14:29 crc kubenswrapper[4764]: I0320 15:14:29.283346 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:29 crc kubenswrapper[4764]: I0320 15:14:29.861510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerStarted","Data":"6903307f21c0d243550548bc7412f34cf1d6d16f55ca8b5afba29d3775347648"} Mar 20 15:14:30 crc kubenswrapper[4764]: I0320 15:14:30.881148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerStarted","Data":"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154"} Mar 20 15:14:30 crc kubenswrapper[4764]: I0320 15:14:30.881546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerStarted","Data":"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a"} Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.378202 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.474609 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.479738 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs\") pod \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.479923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle\") pod \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.480025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data\") pod \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.480065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv5r\" (UniqueName: \"kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r\") pod \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\" (UID: \"fbb122ca-f6cc-46fb-a2f8-f866f4c311da\") " Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.481904 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs" (OuterVolumeSpecName: "logs") pod "fbb122ca-f6cc-46fb-a2f8-f866f4c311da" (UID: "fbb122ca-f6cc-46fb-a2f8-f866f4c311da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.487625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r" (OuterVolumeSpecName: "kube-api-access-dwv5r") pod "fbb122ca-f6cc-46fb-a2f8-f866f4c311da" (UID: "fbb122ca-f6cc-46fb-a2f8-f866f4c311da"). InnerVolumeSpecName "kube-api-access-dwv5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.502884 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.515066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data" (OuterVolumeSpecName: "config-data") pod "fbb122ca-f6cc-46fb-a2f8-f866f4c311da" (UID: "fbb122ca-f6cc-46fb-a2f8-f866f4c311da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.519079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbb122ca-f6cc-46fb-a2f8-f866f4c311da" (UID: "fbb122ca-f6cc-46fb-a2f8-f866f4c311da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.582555 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.582600 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.582613 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv5r\" (UniqueName: \"kubernetes.io/projected/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-kube-api-access-dwv5r\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.582627 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbb122ca-f6cc-46fb-a2f8-f866f4c311da-logs\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.892411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerStarted","Data":"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d"}
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.894135 4764 generic.go:334] "Generic (PLEG): container finished" podID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerID="370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e" exitCode=0
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.894191 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.894252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerDied","Data":"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"}
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.894318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbb122ca-f6cc-46fb-a2f8-f866f4c311da","Type":"ContainerDied","Data":"b8b82edc4016077ae623e5bfa1723518624833c627a2af4d23edf14db44af472"}
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.894340 4764 scope.go:117] "RemoveContainer" containerID="370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.918769 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.928803 4764 scope.go:117] "RemoveContainer" containerID="c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.930094 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.937364 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.953273 4764 scope.go:117] "RemoveContainer" containerID="370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"
Mar 20 15:14:31 crc kubenswrapper[4764]: E0320 15:14:31.953703 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e\": container with ID starting with 370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e not found: ID does not exist" containerID="370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.953738 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e"} err="failed to get container status \"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e\": rpc error: code = NotFound desc = could not find container \"370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e\": container with ID starting with 370590e029fb30d6e73e71e82565f8991ec2012a2a0ca90ab77bba4c00820a7e not found: ID does not exist"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.953758 4764 scope.go:117] "RemoveContainer" containerID="c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"
Mar 20 15:14:31 crc kubenswrapper[4764]: E0320 15:14:31.954019 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093\": container with ID starting with c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093 not found: ID does not exist" containerID="c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.954039 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093"} err="failed to get container status \"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093\": rpc error: code = NotFound desc = could not find container \"c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093\": container with ID starting with c6090beabc6ecb7d97fb3f8abd46f434d57f23f475b7d9025322e86dbb08c093 not found: ID does not exist"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.963788 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:31 crc kubenswrapper[4764]: E0320 15:14:31.964125 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-log"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.964140 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-log"
Mar 20 15:14:31 crc kubenswrapper[4764]: E0320 15:14:31.964157 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-api"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.964164 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-api"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.964339 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-api"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.964365 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" containerName="nova-api-log"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.965209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.969375 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.978794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.978911 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 15:14:31 crc kubenswrapper[4764]: I0320 15:14:31.979092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.083806 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-9x8xg"]
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.085163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.088007 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.088194 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090715 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5lp\" (UniqueName: \"kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.090821 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.096671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9x8xg"]
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwscj\" (UniqueName: \"kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192693 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5lp\" (UniqueName: \"kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.192749 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.193756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.197900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.197984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.198849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.211149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5lp\" (UniqueName: \"kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.240388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs\") pod \"nova-api-0\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.293973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.294137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.294214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwscj\" (UniqueName: \"kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.294278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.297852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.298199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.298818 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.302218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.314343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwscj\" (UniqueName: \"kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj\") pod \"nova-cell1-cell-mapping-9x8xg\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.399851 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9x8xg"
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.805076 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 15:14:32 crc kubenswrapper[4764]: W0320 15:14:32.805216 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ebb38e_c93e_40a8_b2c3_55063698e661.slice/crio-f457b242992eb532e50bc47d8337b3d67aaa37d81edfd52a8a19a5501ae7c3b4 WatchSource:0}: Error finding container f457b242992eb532e50bc47d8337b3d67aaa37d81edfd52a8a19a5501ae7c3b4: Status 404 returned error can't find the container with id f457b242992eb532e50bc47d8337b3d67aaa37d81edfd52a8a19a5501ae7c3b4
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.911748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerStarted","Data":"f457b242992eb532e50bc47d8337b3d67aaa37d81edfd52a8a19a5501ae7c3b4"}
Mar 20 15:14:32 crc kubenswrapper[4764]: I0320 15:14:32.937151 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-9x8xg"]
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.136067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbb122ca-f6cc-46fb-a2f8-f866f4c311da" path="/var/lib/kubelet/pods/fbb122ca-f6cc-46fb-a2f8-f866f4c311da/volumes"
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.923471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerStarted","Data":"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2"}
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.923654 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-central-agent" containerID="cri-o://f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a" gracePeriod=30
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.923941 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.924044 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="sg-core" containerID="cri-o://d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d" gracePeriod=30
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.924077 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-notification-agent" containerID="cri-o://46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154" gracePeriod=30
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.924078 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="proxy-httpd" containerID="cri-o://9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2" gracePeriod=30
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.925588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9x8xg" event={"ID":"a62396d5-6708-4d82-863d-5c4a7613290d","Type":"ContainerStarted","Data":"265f50780edff0cad1f2ae2abc645b4b861d39b8ead249186613c401345e77a1"}
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.925622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9x8xg" event={"ID":"a62396d5-6708-4d82-863d-5c4a7613290d","Type":"ContainerStarted","Data":"6baf8342c4919cec1144bf74d44590e2337568b689167f03e7e1c7c92fed4baf"}
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.928887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerStarted","Data":"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e"}
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.928919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerStarted","Data":"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4"}
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.944130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.834862998 podStartE2EDuration="5.944116825s" podCreationTimestamp="2026-03-20 15:14:28 +0000 UTC" firstStartedPulling="2026-03-20 15:14:29.27406191 +0000 UTC m=+1390.890251039" lastFinishedPulling="2026-03-20 15:14:33.383315747 +0000 UTC m=+1394.999504866" observedRunningTime="2026-03-20 15:14:33.941317019 +0000 UTC m=+1395.557506148" watchObservedRunningTime="2026-03-20 15:14:33.944116825 +0000 UTC m=+1395.560305954"
Mar 20 15:14:33 crc kubenswrapper[4764]: I0320 15:14:33.972610 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.972589553 podStartE2EDuration="2.972589553s" podCreationTimestamp="2026-03-20 15:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:33.970350455 +0000 UTC m=+1395.586539604" watchObservedRunningTime="2026-03-20 15:14:33.972589553 +0000 UTC m=+1395.588778692"
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.000166 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-9x8xg" podStartSLOduration=2.000149015 podStartE2EDuration="2.000149015s" podCreationTimestamp="2026-03-20 15:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:33.99407983 +0000 UTC m=+1395.610268959" watchObservedRunningTime="2026-03-20 15:14:34.000149015 +0000 UTC m=+1395.616338164"
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940019 4764 generic.go:334] "Generic (PLEG): container finished" podID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerID="9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2" exitCode=0
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940326 4764 generic.go:334] "Generic (PLEG): container finished" podID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerID="d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d" exitCode=2
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerDied","Data":"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2"}
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerDied","Data":"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d"}
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerDied","Data":"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154"}
Mar 20 15:14:34 crc kubenswrapper[4764]: I0320 15:14:34.940340 4764 generic.go:334] "Generic (PLEG): container finished" podID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerID="46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154" exitCode=0
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.327182 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb"
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.413702 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"]
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.414064 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="dnsmasq-dns" containerID="cri-o://5ca67c7488940d22b3e29c7c1ec2d664c3528065a3fbb57b032a20df5efa9e58" gracePeriod=10
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.948887 4764 generic.go:334] "Generic (PLEG): container finished" podID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerID="5ca67c7488940d22b3e29c7c1ec2d664c3528065a3fbb57b032a20df5efa9e58" exitCode=0
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.948975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" event={"ID":"e1cb5067-4f80-440e-9c1e-c422e012190d","Type":"ContainerDied","Data":"5ca67c7488940d22b3e29c7c1ec2d664c3528065a3fbb57b032a20df5efa9e58"}
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.949174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" event={"ID":"e1cb5067-4f80-440e-9c1e-c422e012190d","Type":"ContainerDied","Data":"e415b701cad069ab8a5ecb24fc0dbd217ce333c67a7a91c8ba16b449bcef54e1"}
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.949189 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e415b701cad069ab8a5ecb24fc0dbd217ce333c67a7a91c8ba16b449bcef54e1"
Mar 20 15:14:35 crc kubenswrapper[4764]: I0320 15:14:35.959120 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn"
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fjnr\" (UniqueName: \"kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.075422 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config\") pod \"e1cb5067-4f80-440e-9c1e-c422e012190d\" (UID: \"e1cb5067-4f80-440e-9c1e-c422e012190d\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.093739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr" (OuterVolumeSpecName: "kube-api-access-2fjnr") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "kube-api-access-2fjnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.129995 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.136884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config" (OuterVolumeSpecName: "config") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.139831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.142725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.146698 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1cb5067-4f80-440e-9c1e-c422e012190d" (UID: "e1cb5067-4f80-440e-9c1e-c422e012190d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.177926 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.177961 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.177972 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.177982 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fjnr\" (UniqueName: \"kubernetes.io/projected/e1cb5067-4f80-440e-9c1e-c422e012190d-kube-api-access-2fjnr\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.177993 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.178001 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1cb5067-4f80-440e-9c1e-c422e012190d-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.768642 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891184 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891395 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891453 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891491 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.891634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw\") pod \"497cc88c-68ff-408d-8272-6067b3bcaf88\" (UID: \"497cc88c-68ff-408d-8272-6067b3bcaf88\") "
Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.892281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.892349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.894595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw" (OuterVolumeSpecName: "kube-api-access-xwfdw") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "kube-api-access-xwfdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.897576 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts" (OuterVolumeSpecName: "scripts") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.915365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.941832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.960777 4764 generic.go:334] "Generic (PLEG): container finished" podID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerID="f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a" exitCode=0 Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.960887 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.961485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerDied","Data":"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a"} Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.961547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"497cc88c-68ff-408d-8272-6067b3bcaf88","Type":"ContainerDied","Data":"6903307f21c0d243550548bc7412f34cf1d6d16f55ca8b5afba29d3775347648"} Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.961580 4764 scope.go:117] "RemoveContainer" containerID="9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.961646 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.975497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994580 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994619 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994630 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994646 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994658 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/497cc88c-68ff-408d-8272-6067b3bcaf88-kube-api-access-xwfdw\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994668 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/497cc88c-68ff-408d-8272-6067b3bcaf88-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:36 crc kubenswrapper[4764]: I0320 15:14:36.994679 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.009771 4764 scope.go:117] "RemoveContainer" containerID="d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.017684 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data" (OuterVolumeSpecName: "config-data") pod "497cc88c-68ff-408d-8272-6067b3bcaf88" (UID: "497cc88c-68ff-408d-8272-6067b3bcaf88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.025552 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"] Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.026005 4764 scope.go:117] "RemoveContainer" containerID="46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.039235 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-7fnxn"] Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.046904 4764 scope.go:117] "RemoveContainer" containerID="f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.076701 4764 scope.go:117] "RemoveContainer" containerID="9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.077827 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2\": container with ID starting with 9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2 not found: ID does not exist" containerID="9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.077859 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2"} err="failed to get container status \"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2\": rpc error: code = NotFound desc = could not find container \"9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2\": container with ID starting with 9cc266e306e82997d6368ff36c70ead1e739ebc0d1b28ebffee32dd73ebba2d2 not found: ID does not exist" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.077879 4764 scope.go:117] "RemoveContainer" containerID="d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.078190 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d\": container with ID starting with d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d not found: ID does not exist" containerID="d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.078248 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d"} err="failed to get container status \"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d\": rpc error: code = NotFound desc = could not find container 
\"d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d\": container with ID starting with d4718647fc9cb797b5aca5956e8c32de08eacad3ba8e454e71f1b1bfb5071f9d not found: ID does not exist" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.078282 4764 scope.go:117] "RemoveContainer" containerID="46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.078646 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154\": container with ID starting with 46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154 not found: ID does not exist" containerID="46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.078680 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154"} err="failed to get container status \"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154\": rpc error: code = NotFound desc = could not find container \"46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154\": container with ID starting with 46af11db0fbf86fda7c1f7d76b36aaa0fbeeaedc2acc1aec41c1b503406db154 not found: ID does not exist" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.078701 4764 scope.go:117] "RemoveContainer" containerID="f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.079015 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a\": container with ID starting with f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a not found: ID does not exist" 
containerID="f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.079042 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a"} err="failed to get container status \"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a\": rpc error: code = NotFound desc = could not find container \"f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a\": container with ID starting with f9ed87553cc8be20fc45b48917f823d5b4af8b5955e44b64c4df80c66e5f243a not found: ID does not exist" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.096634 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/497cc88c-68ff-408d-8272-6067b3bcaf88-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.136958 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" path="/var/lib/kubelet/pods/e1cb5067-4f80-440e-9c1e-c422e012190d/volumes" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.288122 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.295736 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304765 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="sg-core" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304783 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" 
containerName="sg-core" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304797 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="init" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="init" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304814 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="proxy-httpd" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304820 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="proxy-httpd" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304829 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-central-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304834 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-central-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304848 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="dnsmasq-dns" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304853 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="dnsmasq-dns" Mar 20 15:14:37 crc kubenswrapper[4764]: E0320 15:14:37.304869 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-notification-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.304876 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" 
containerName="ceilometer-notification-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.305071 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-central-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.305090 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="sg-core" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.305106 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="dnsmasq-dns" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.305121 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="ceilometer-notification-agent" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.305138 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" containerName="proxy-httpd" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.306752 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.309865 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.309950 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.310605 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.320682 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.406784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-scripts\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.406890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.406911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52f5\" (UniqueName: \"kubernetes.io/projected/2c43fb18-f22b-4423-8241-a6785a42b6e8-kube-api-access-l52f5\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.407013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.407145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-config-data\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.407227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.407299 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.407324 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-scripts\") pod \"ceilometer-0\" (UID: 
\"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508824 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52f5\" (UniqueName: \"kubernetes.io/projected/2c43fb18-f22b-4423-8241-a6785a42b6e8-kube-api-access-l52f5\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-config-data\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.508945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.510610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-log-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.510616 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c43fb18-f22b-4423-8241-a6785a42b6e8-run-httpd\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.514419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.514605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: 
I0320 15:14:37.514693 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-scripts\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.514828 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-config-data\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.514892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c43fb18-f22b-4423-8241-a6785a42b6e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.526834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52f5\" (UniqueName: \"kubernetes.io/projected/2c43fb18-f22b-4423-8241-a6785a42b6e8-kube-api-access-l52f5\") pod \"ceilometer-0\" (UID: \"2c43fb18-f22b-4423-8241-a6785a42b6e8\") " pod="openstack/ceilometer-0" Mar 20 15:14:37 crc kubenswrapper[4764]: I0320 15:14:37.625788 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 15:14:38 crc kubenswrapper[4764]: W0320 15:14:38.084071 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c43fb18_f22b_4423_8241_a6785a42b6e8.slice/crio-90ec776888982d07281d0c281f30d807400564ced18267eced2e53678a2194c3 WatchSource:0}: Error finding container 90ec776888982d07281d0c281f30d807400564ced18267eced2e53678a2194c3: Status 404 returned error can't find the container with id 90ec776888982d07281d0c281f30d807400564ced18267eced2e53678a2194c3 Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.089617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.443744 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.444291 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.444480 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.445319 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2"} 
pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.450662 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2" gracePeriod=600 Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.981794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c43fb18-f22b-4423-8241-a6785a42b6e8","Type":"ContainerStarted","Data":"a994e00eb682c71a4100cf758be72e60c4fbb0e3f1b9b0956e83ab6abdc587e3"} Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.982197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c43fb18-f22b-4423-8241-a6785a42b6e8","Type":"ContainerStarted","Data":"90ec776888982d07281d0c281f30d807400564ced18267eced2e53678a2194c3"} Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.983891 4764 generic.go:334] "Generic (PLEG): container finished" podID="a62396d5-6708-4d82-863d-5c4a7613290d" containerID="265f50780edff0cad1f2ae2abc645b4b861d39b8ead249186613c401345e77a1" exitCode=0 Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.983988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9x8xg" event={"ID":"a62396d5-6708-4d82-863d-5c4a7613290d","Type":"ContainerDied","Data":"265f50780edff0cad1f2ae2abc645b4b861d39b8ead249186613c401345e77a1"} Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.994811 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2" exitCode=0 Mar 20 15:14:38 crc 
kubenswrapper[4764]: I0320 15:14:38.994850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2"} Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.994872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be"} Mar 20 15:14:38 crc kubenswrapper[4764]: I0320 15:14:38.994893 4764 scope.go:117] "RemoveContainer" containerID="474d025340a960c22301e41eab332b831f75f8273d6153efd902506c422faa11" Mar 20 15:14:39 crc kubenswrapper[4764]: I0320 15:14:39.144481 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497cc88c-68ff-408d-8272-6067b3bcaf88" path="/var/lib/kubelet/pods/497cc88c-68ff-408d-8272-6067b3bcaf88/volumes" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.029031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c43fb18-f22b-4423-8241-a6785a42b6e8","Type":"ContainerStarted","Data":"c8b9b9e0f0b4bd3f33a561c697cb0e05e416694f87fb7367de8d9bf732d303bb"} Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.389467 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9x8xg" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.460949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data\") pod \"a62396d5-6708-4d82-863d-5c4a7613290d\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.461007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts\") pod \"a62396d5-6708-4d82-863d-5c4a7613290d\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.461082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle\") pod \"a62396d5-6708-4d82-863d-5c4a7613290d\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.461154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwscj\" (UniqueName: \"kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj\") pod \"a62396d5-6708-4d82-863d-5c4a7613290d\" (UID: \"a62396d5-6708-4d82-863d-5c4a7613290d\") " Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.469362 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj" (OuterVolumeSpecName: "kube-api-access-wwscj") pod "a62396d5-6708-4d82-863d-5c4a7613290d" (UID: "a62396d5-6708-4d82-863d-5c4a7613290d"). InnerVolumeSpecName "kube-api-access-wwscj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.469726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts" (OuterVolumeSpecName: "scripts") pod "a62396d5-6708-4d82-863d-5c4a7613290d" (UID: "a62396d5-6708-4d82-863d-5c4a7613290d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.492565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data" (OuterVolumeSpecName: "config-data") pod "a62396d5-6708-4d82-863d-5c4a7613290d" (UID: "a62396d5-6708-4d82-863d-5c4a7613290d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.494571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a62396d5-6708-4d82-863d-5c4a7613290d" (UID: "a62396d5-6708-4d82-863d-5c4a7613290d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.562847 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.562875 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.562883 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62396d5-6708-4d82-863d-5c4a7613290d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.562896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwscj\" (UniqueName: \"kubernetes.io/projected/a62396d5-6708-4d82-863d-5c4a7613290d-kube-api-access-wwscj\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:40 crc kubenswrapper[4764]: I0320 15:14:40.846749 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-7fnxn" podUID="e1cb5067-4f80-440e-9c1e-c422e012190d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.200:5353: i/o timeout" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.042079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-9x8xg" event={"ID":"a62396d5-6708-4d82-863d-5c4a7613290d","Type":"ContainerDied","Data":"6baf8342c4919cec1144bf74d44590e2337568b689167f03e7e1c7c92fed4baf"} Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.042246 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6baf8342c4919cec1144bf74d44590e2337568b689167f03e7e1c7c92fed4baf" Mar 20 15:14:41 crc kubenswrapper[4764]: 
I0320 15:14:41.042141 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-9x8xg" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.044411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c43fb18-f22b-4423-8241-a6785a42b6e8","Type":"ContainerStarted","Data":"1ba08a0e0e8f4fa48facbda04d1d0929bcd697b09c7dc5ab2b2cc90f6fe7ed9a"} Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.106001 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.106248 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-log" containerID="cri-o://07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" gracePeriod=30 Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.106377 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-api" containerID="cri-o://d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" gracePeriod=30 Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.139847 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.140179 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerName="nova-scheduler-scheduler" containerID="cri-o://8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" gracePeriod=30 Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.144743 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 
15:14:41.144962 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-log" containerID="cri-o://781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7" gracePeriod=30 Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.145349 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-metadata" containerID="cri-o://615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e" gracePeriod=30 Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.673335 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr5lp\" (UniqueName: \"kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786332 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.786445 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs\") pod \"65ebb38e-c93e-40a8-b2c3-55063698e661\" (UID: \"65ebb38e-c93e-40a8-b2c3-55063698e661\") " Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.787090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs" (OuterVolumeSpecName: "logs") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.787285 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65ebb38e-c93e-40a8-b2c3-55063698e661-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.791352 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp" (OuterVolumeSpecName: "kube-api-access-nr5lp") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). 
InnerVolumeSpecName "kube-api-access-nr5lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.818657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data" (OuterVolumeSpecName: "config-data") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.837505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.851362 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.854102 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "65ebb38e-c93e-40a8-b2c3-55063698e661" (UID: "65ebb38e-c93e-40a8-b2c3-55063698e661"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.888903 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr5lp\" (UniqueName: \"kubernetes.io/projected/65ebb38e-c93e-40a8-b2c3-55063698e661-kube-api-access-nr5lp\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.889117 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.889179 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.889249 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:41 crc kubenswrapper[4764]: I0320 15:14:41.889301 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ebb38e-c93e-40a8-b2c3-55063698e661-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.053981 4764 generic.go:334] "Generic (PLEG): container finished" podID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerID="d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" exitCode=0 Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.054553 4764 generic.go:334] "Generic (PLEG): container finished" podID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerID="07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" exitCode=143 Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.054054 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerDied","Data":"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e"} Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.054038 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.055195 4764 scope.go:117] "RemoveContainer" containerID="d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.055092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerDied","Data":"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4"} Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.055411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65ebb38e-c93e-40a8-b2c3-55063698e661","Type":"ContainerDied","Data":"f457b242992eb532e50bc47d8337b3d67aaa37d81edfd52a8a19a5501ae7c3b4"} Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.058075 4764 generic.go:334] "Generic (PLEG): container finished" podID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerID="781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7" exitCode=143 Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.058131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerDied","Data":"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7"} Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.063091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2c43fb18-f22b-4423-8241-a6785a42b6e8","Type":"ContainerStarted","Data":"00b0521ed4ebd4922e1a9e8905673224bf15481cd74e5ce601068b6cb080e0ca"} Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.063283 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.076754 4764 scope.go:117] "RemoveContainer" containerID="07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.089655 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.376235972 podStartE2EDuration="5.0896348s" podCreationTimestamp="2026-03-20 15:14:37 +0000 UTC" firstStartedPulling="2026-03-20 15:14:38.087776902 +0000 UTC m=+1399.703966031" lastFinishedPulling="2026-03-20 15:14:41.80117573 +0000 UTC m=+1403.417364859" observedRunningTime="2026-03-20 15:14:42.087822175 +0000 UTC m=+1403.704011304" watchObservedRunningTime="2026-03-20 15:14:42.0896348 +0000 UTC m=+1403.705823929" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.100826 4764 scope.go:117] "RemoveContainer" containerID="d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.101269 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e\": container with ID starting with d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e not found: ID does not exist" containerID="d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101308 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e"} err="failed to get 
container status \"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e\": rpc error: code = NotFound desc = could not find container \"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e\": container with ID starting with d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e not found: ID does not exist" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101333 4764 scope.go:117] "RemoveContainer" containerID="07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.101625 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4\": container with ID starting with 07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4 not found: ID does not exist" containerID="07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101674 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4"} err="failed to get container status \"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4\": rpc error: code = NotFound desc = could not find container \"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4\": container with ID starting with 07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4 not found: ID does not exist" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101702 4764 scope.go:117] "RemoveContainer" containerID="d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101940 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e"} 
err="failed to get container status \"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e\": rpc error: code = NotFound desc = could not find container \"d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e\": container with ID starting with d9fcd99aea8d46669a4c58f2ef7fd20a9e9271f517ed5c2e2ca0851982cbc42e not found: ID does not exist" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.101966 4764 scope.go:117] "RemoveContainer" containerID="07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.102236 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4"} err="failed to get container status \"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4\": rpc error: code = NotFound desc = could not find container \"07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4\": container with ID starting with 07c6ab9417c4fdd61adb7ca84be83fa0534cfdc626e2b25f9760888e09d3a1b4 not found: ID does not exist" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.116476 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.126594 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.135864 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.136219 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-api" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136236 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-api" Mar 20 15:14:42 crc 
kubenswrapper[4764]: E0320 15:14:42.136256 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-log" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136263 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-log" Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.136295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62396d5-6708-4d82-863d-5c4a7613290d" containerName="nova-manage" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136301 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62396d5-6708-4d82-863d-5c4a7613290d" containerName="nova-manage" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136477 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-log" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136500 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62396d5-6708-4d82-863d-5c4a7613290d" containerName="nova-manage" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.136519 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" containerName="nova-api-api" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.137451 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.139764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.140277 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.140698 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.180242 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.180326 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.183007 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.184372 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 15:14:42 crc kubenswrapper[4764]: E0320 15:14:42.184415 4764 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerName="nova-scheduler-scheduler" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0956f601-6858-456b-8f63-ea6c5b4aebe1-logs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296786 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-public-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5d8f\" (UniqueName: 
\"kubernetes.io/projected/0956f601-6858-456b-8f63-ea6c5b4aebe1-kube-api-access-f5d8f\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.296881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-config-data\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0956f601-6858-456b-8f63-ea6c5b4aebe1-logs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-public-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5d8f\" (UniqueName: \"kubernetes.io/projected/0956f601-6858-456b-8f63-ea6c5b4aebe1-kube-api-access-f5d8f\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-config-data\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 
15:14:42.399216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.399600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0956f601-6858-456b-8f63-ea6c5b4aebe1-logs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.403932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-config-data\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.406106 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-public-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.406320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " 
pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.408423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0956f601-6858-456b-8f63-ea6c5b4aebe1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.421068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5d8f\" (UniqueName: \"kubernetes.io/projected/0956f601-6858-456b-8f63-ea6c5b4aebe1-kube-api-access-f5d8f\") pod \"nova-api-0\" (UID: \"0956f601-6858-456b-8f63-ea6c5b4aebe1\") " pod="openstack/nova-api-0" Mar 20 15:14:42 crc kubenswrapper[4764]: I0320 15:14:42.451130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 15:14:43 crc kubenswrapper[4764]: I0320 15:14:43.060797 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 15:14:43 crc kubenswrapper[4764]: W0320 15:14:43.063091 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0956f601_6858_456b_8f63_ea6c5b4aebe1.slice/crio-6f29f196ff8497316247afbe587d948946b195fafe7228216e4d67337f3bf783 WatchSource:0}: Error finding container 6f29f196ff8497316247afbe587d948946b195fafe7228216e4d67337f3bf783: Status 404 returned error can't find the container with id 6f29f196ff8497316247afbe587d948946b195fafe7228216e4d67337f3bf783 Mar 20 15:14:43 crc kubenswrapper[4764]: I0320 15:14:43.073975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0956f601-6858-456b-8f63-ea6c5b4aebe1","Type":"ContainerStarted","Data":"6f29f196ff8497316247afbe587d948946b195fafe7228216e4d67337f3bf783"} Mar 20 15:14:43 crc kubenswrapper[4764]: I0320 15:14:43.144177 4764 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="65ebb38e-c93e-40a8-b2c3-55063698e661" path="/var/lib/kubelet/pods/65ebb38e-c93e-40a8-b2c3-55063698e661/volumes" Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.083965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0956f601-6858-456b-8f63-ea6c5b4aebe1","Type":"ContainerStarted","Data":"69fbca31b1b5ca0f6a3e9744235767507ecd3c5b1a579c7d692fe0ab421cea7c"} Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.084424 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0956f601-6858-456b-8f63-ea6c5b4aebe1","Type":"ContainerStarted","Data":"c01628fd174c0ecc371854b1f698b858865d8c700392dd27b561f6657d666803"} Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.118822 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.118805771 podStartE2EDuration="2.118805771s" podCreationTimestamp="2026-03-20 15:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:44.11780247 +0000 UTC m=+1405.733991599" watchObservedRunningTime="2026-03-20 15:14:44.118805771 +0000 UTC m=+1405.734994900" Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.800662 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.968766 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs\") pod \"020b2b74-9e86-4e2f-804c-3c7595dd2899\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.968825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs\") pod \"020b2b74-9e86-4e2f-804c-3c7595dd2899\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.968867 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data\") pod \"020b2b74-9e86-4e2f-804c-3c7595dd2899\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.968971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh69w\" (UniqueName: \"kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w\") pod \"020b2b74-9e86-4e2f-804c-3c7595dd2899\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.969124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle\") pod \"020b2b74-9e86-4e2f-804c-3c7595dd2899\" (UID: \"020b2b74-9e86-4e2f-804c-3c7595dd2899\") " Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.969703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs" (OuterVolumeSpecName: "logs") pod "020b2b74-9e86-4e2f-804c-3c7595dd2899" (UID: "020b2b74-9e86-4e2f-804c-3c7595dd2899"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:14:44 crc kubenswrapper[4764]: I0320 15:14:44.976655 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w" (OuterVolumeSpecName: "kube-api-access-hh69w") pod "020b2b74-9e86-4e2f-804c-3c7595dd2899" (UID: "020b2b74-9e86-4e2f-804c-3c7595dd2899"). InnerVolumeSpecName "kube-api-access-hh69w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.007550 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data" (OuterVolumeSpecName: "config-data") pod "020b2b74-9e86-4e2f-804c-3c7595dd2899" (UID: "020b2b74-9e86-4e2f-804c-3c7595dd2899"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.015621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020b2b74-9e86-4e2f-804c-3c7595dd2899" (UID: "020b2b74-9e86-4e2f-804c-3c7595dd2899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.036024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "020b2b74-9e86-4e2f-804c-3c7595dd2899" (UID: "020b2b74-9e86-4e2f-804c-3c7595dd2899"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.070609 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/020b2b74-9e86-4e2f-804c-3c7595dd2899-logs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.070645 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.070656 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh69w\" (UniqueName: \"kubernetes.io/projected/020b2b74-9e86-4e2f-804c-3c7595dd2899-kube-api-access-hh69w\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.070667 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.070677 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/020b2b74-9e86-4e2f-804c-3c7595dd2899-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.094296 4764 generic.go:334] "Generic (PLEG): container finished" podID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerID="615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e" exitCode=0 Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.094363 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.094420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerDied","Data":"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e"} Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.094446 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"020b2b74-9e86-4e2f-804c-3c7595dd2899","Type":"ContainerDied","Data":"9974eff12c88d8d14dd42a0a8a97cb85375b07d7e145583349bd5f934cbe3a98"} Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.094462 4764 scope.go:117] "RemoveContainer" containerID="615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.122337 4764 scope.go:117] "RemoveContainer" containerID="781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.141835 4764 scope.go:117] "RemoveContainer" containerID="615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e" Mar 20 15:14:45 crc kubenswrapper[4764]: E0320 15:14:45.142127 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e\": container with ID starting with 615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e not found: ID does not exist" containerID="615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.142162 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e"} err="failed to get container status \"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e\": rpc 
error: code = NotFound desc = could not find container \"615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e\": container with ID starting with 615ac40e0467d68af1136c1f8f6fc136ca197e46d8ffe91cdc325e4d5f189d5e not found: ID does not exist" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.142186 4764 scope.go:117] "RemoveContainer" containerID="781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7" Mar 20 15:14:45 crc kubenswrapper[4764]: E0320 15:14:45.142368 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7\": container with ID starting with 781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7 not found: ID does not exist" containerID="781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.142405 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7"} err="failed to get container status \"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7\": rpc error: code = NotFound desc = could not find container \"781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7\": container with ID starting with 781e33b63021bf7f16ef0f970ae8bcb530b2974c00f395bcd50b76e7d3dfe5d7 not found: ID does not exist" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.150473 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.186660 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.201969 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:45 crc kubenswrapper[4764]: E0320 
15:14:45.202327 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-log" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.202341 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-log" Mar 20 15:14:45 crc kubenswrapper[4764]: E0320 15:14:45.202361 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-metadata" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.202367 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-metadata" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.202568 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-metadata" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.202584 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" containerName="nova-metadata-log" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.204954 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.208983 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.209836 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.212974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.378234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-config-data\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.378343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645b2\" (UniqueName: \"kubernetes.io/projected/daebb9be-71bf-47d6-9e0c-def343511d34-kube-api-access-645b2\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.378578 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.378657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.378703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebb9be-71bf-47d6-9e0c-def343511d34-logs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.480760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645b2\" (UniqueName: \"kubernetes.io/projected/daebb9be-71bf-47d6-9e0c-def343511d34-kube-api-access-645b2\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.480913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.480953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.480975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebb9be-71bf-47d6-9e0c-def343511d34-logs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " 
pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.481668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/daebb9be-71bf-47d6-9e0c-def343511d34-logs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.481720 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-config-data\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.492609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.492784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-config-data\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.492868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/daebb9be-71bf-47d6-9e0c-def343511d34-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.499479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645b2\" (UniqueName: 
\"kubernetes.io/projected/daebb9be-71bf-47d6-9e0c-def343511d34-kube-api-access-645b2\") pod \"nova-metadata-0\" (UID: \"daebb9be-71bf-47d6-9e0c-def343511d34\") " pod="openstack/nova-metadata-0" Mar 20 15:14:45 crc kubenswrapper[4764]: I0320 15:14:45.539700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.052961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.127161 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerID="8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" exitCode=0 Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.127250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d1ead43-f1f4-41ba-8de0-bf1c1f386272","Type":"ContainerDied","Data":"8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4"} Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.496137 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.604185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j94xg\" (UniqueName: \"kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg\") pod \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.604246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle\") pod \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.604399 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data\") pod \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\" (UID: \"2d1ead43-f1f4-41ba-8de0-bf1c1f386272\") " Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.611546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg" (OuterVolumeSpecName: "kube-api-access-j94xg") pod "2d1ead43-f1f4-41ba-8de0-bf1c1f386272" (UID: "2d1ead43-f1f4-41ba-8de0-bf1c1f386272"). InnerVolumeSpecName "kube-api-access-j94xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.632943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d1ead43-f1f4-41ba-8de0-bf1c1f386272" (UID: "2d1ead43-f1f4-41ba-8de0-bf1c1f386272"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.643808 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data" (OuterVolumeSpecName: "config-data") pod "2d1ead43-f1f4-41ba-8de0-bf1c1f386272" (UID: "2d1ead43-f1f4-41ba-8de0-bf1c1f386272"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.706241 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j94xg\" (UniqueName: \"kubernetes.io/projected/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-kube-api-access-j94xg\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.706271 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:46 crc kubenswrapper[4764]: I0320 15:14:46.706280 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d1ead43-f1f4-41ba-8de0-bf1c1f386272-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.135779 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020b2b74-9e86-4e2f-804c-3c7595dd2899" path="/var/lib/kubelet/pods/020b2b74-9e86-4e2f-804c-3c7595dd2899/volumes" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.137560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d1ead43-f1f4-41ba-8de0-bf1c1f386272","Type":"ContainerDied","Data":"cf46cbe118f6350a6c72d18a1e3b11b76605980944ea2e8ad150395dfcf242c0"} Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.137569 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.137607 4764 scope.go:117] "RemoveContainer" containerID="8adadf8afe223d4f284ab38ce63b2241b697f0a81195f348367b20dee2e7bdd4" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.140616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebb9be-71bf-47d6-9e0c-def343511d34","Type":"ContainerStarted","Data":"cd7b991a5e42a8f065784e9f5dd5878483ec0212efd7bf92632bca5ca038829a"} Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.140679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebb9be-71bf-47d6-9e0c-def343511d34","Type":"ContainerStarted","Data":"841cef40de44b011f52aa57d7276e5f1085bab4ded31da8a68e2cfe90789a74a"} Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.140700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"daebb9be-71bf-47d6-9e0c-def343511d34","Type":"ContainerStarted","Data":"f879df57dde68fea8d2e345b66a8d6fdb4a57f76f20aa33377aff441a2a2b399"} Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.177633 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.177613528 podStartE2EDuration="2.177613528s" podCreationTimestamp="2026-03-20 15:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:47.156617216 +0000 UTC m=+1408.772806365" watchObservedRunningTime="2026-03-20 15:14:47.177613528 +0000 UTC m=+1408.793802667" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.183627 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.195193 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.203093 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:47 crc kubenswrapper[4764]: E0320 15:14:47.203591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerName="nova-scheduler-scheduler" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.203616 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerName="nova-scheduler-scheduler" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.203923 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" containerName="nova-scheduler-scheduler" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.204723 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.206660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.219404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.322801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2mmm\" (UniqueName: \"kubernetes.io/projected/bf67384d-ae1b-4966-95af-d73c0e45d7a1-kube-api-access-s2mmm\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.323054 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-config-data\") pod \"nova-scheduler-0\" 
(UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.323482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.425076 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2mmm\" (UniqueName: \"kubernetes.io/projected/bf67384d-ae1b-4966-95af-d73c0e45d7a1-kube-api-access-s2mmm\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.425152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-config-data\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.425238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.430496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-config-data\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.430517 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf67384d-ae1b-4966-95af-d73c0e45d7a1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.452287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2mmm\" (UniqueName: \"kubernetes.io/projected/bf67384d-ae1b-4966-95af-d73c0e45d7a1-kube-api-access-s2mmm\") pod \"nova-scheduler-0\" (UID: \"bf67384d-ae1b-4966-95af-d73c0e45d7a1\") " pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.530783 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 15:14:47 crc kubenswrapper[4764]: I0320 15:14:47.976170 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 15:14:47 crc kubenswrapper[4764]: W0320 15:14:47.978688 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf67384d_ae1b_4966_95af_d73c0e45d7a1.slice/crio-066a7eb7e9960581064aab7dc76883a6dc6640a8984f215f3c1a940c7ab0ff07 WatchSource:0}: Error finding container 066a7eb7e9960581064aab7dc76883a6dc6640a8984f215f3c1a940c7ab0ff07: Status 404 returned error can't find the container with id 066a7eb7e9960581064aab7dc76883a6dc6640a8984f215f3c1a940c7ab0ff07 Mar 20 15:14:48 crc kubenswrapper[4764]: I0320 15:14:48.151574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf67384d-ae1b-4966-95af-d73c0e45d7a1","Type":"ContainerStarted","Data":"066a7eb7e9960581064aab7dc76883a6dc6640a8984f215f3c1a940c7ab0ff07"} Mar 20 15:14:49 crc kubenswrapper[4764]: I0320 15:14:49.162344 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2d1ead43-f1f4-41ba-8de0-bf1c1f386272" path="/var/lib/kubelet/pods/2d1ead43-f1f4-41ba-8de0-bf1c1f386272/volumes" Mar 20 15:14:49 crc kubenswrapper[4764]: I0320 15:14:49.185105 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bf67384d-ae1b-4966-95af-d73c0e45d7a1","Type":"ContainerStarted","Data":"c061c98775b587465348a4845d3ceb8bef0c9e6d6107570080a67728b19d047d"} Mar 20 15:14:49 crc kubenswrapper[4764]: I0320 15:14:49.217710 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.217686472 podStartE2EDuration="2.217686472s" podCreationTimestamp="2026-03-20 15:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:14:49.204491279 +0000 UTC m=+1410.820680448" watchObservedRunningTime="2026-03-20 15:14:49.217686472 +0000 UTC m=+1410.833875611" Mar 20 15:14:52 crc kubenswrapper[4764]: I0320 15:14:52.451813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:14:52 crc kubenswrapper[4764]: I0320 15:14:52.452224 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 15:14:52 crc kubenswrapper[4764]: I0320 15:14:52.531974 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 15:14:53 crc kubenswrapper[4764]: I0320 15:14:53.464601 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0956f601-6858-456b-8f63-ea6c5b4aebe1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:53 crc kubenswrapper[4764]: I0320 15:14:53.464636 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="0956f601-6858-456b-8f63-ea6c5b4aebe1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:55 crc kubenswrapper[4764]: I0320 15:14:55.540661 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:14:55 crc kubenswrapper[4764]: I0320 15:14:55.540973 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 15:14:56 crc kubenswrapper[4764]: I0320 15:14:56.559710 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="daebb9be-71bf-47d6-9e0c-def343511d34" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:56 crc kubenswrapper[4764]: I0320 15:14:56.559710 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="daebb9be-71bf-47d6-9e0c-def343511d34" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 15:14:57 crc kubenswrapper[4764]: I0320 15:14:57.531459 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 15:14:57 crc kubenswrapper[4764]: I0320 15:14:57.561189 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 15:14:58 crc kubenswrapper[4764]: I0320 15:14:58.303324 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.159169 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh"] Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.160890 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.163346 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.163364 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.167706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh"] Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.176055 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.176176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jrz\" (UniqueName: \"kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.176289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.279874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.279955 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.280017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jrz\" (UniqueName: \"kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.280705 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.292269 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.307813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jrz\" (UniqueName: \"kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz\") pod \"collect-profiles-29566995-pwdxh\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.451295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.451777 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.492156 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:00 crc kubenswrapper[4764]: I0320 15:15:00.994219 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh"] Mar 20 15:15:01 crc kubenswrapper[4764]: I0320 15:15:01.336975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" event={"ID":"971cebcf-ddc9-443e-99aa-6e110026e383","Type":"ContainerStarted","Data":"76dccbd33eecbed08757eba58f315bbc66e0f8508f7e534e54afd7e588597f0c"} Mar 20 15:15:01 crc kubenswrapper[4764]: I0320 15:15:01.337024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" event={"ID":"971cebcf-ddc9-443e-99aa-6e110026e383","Type":"ContainerStarted","Data":"12aaa08a8ad96303945272ed13880a4cd8dd3d15589ff1ddce40462a78805762"} Mar 20 15:15:01 crc kubenswrapper[4764]: I0320 15:15:01.359983 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" podStartSLOduration=1.3599617290000001 podStartE2EDuration="1.359961729s" podCreationTimestamp="2026-03-20 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:15:01.354437861 +0000 UTC m=+1422.970627070" watchObservedRunningTime="2026-03-20 15:15:01.359961729 +0000 UTC m=+1422.976150878" Mar 20 15:15:02 crc kubenswrapper[4764]: I0320 15:15:02.349710 4764 generic.go:334] "Generic (PLEG): container finished" podID="971cebcf-ddc9-443e-99aa-6e110026e383" containerID="76dccbd33eecbed08757eba58f315bbc66e0f8508f7e534e54afd7e588597f0c" exitCode=0 Mar 20 15:15:02 crc kubenswrapper[4764]: I0320 15:15:02.350119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" event={"ID":"971cebcf-ddc9-443e-99aa-6e110026e383","Type":"ContainerDied","Data":"76dccbd33eecbed08757eba58f315bbc66e0f8508f7e534e54afd7e588597f0c"} Mar 20 15:15:02 crc kubenswrapper[4764]: I0320 15:15:02.459318 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:15:02 crc kubenswrapper[4764]: I0320 15:15:02.460711 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 15:15:02 crc kubenswrapper[4764]: I0320 15:15:02.471446 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.375651 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.540262 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.540326 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.799499 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.959846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume\") pod \"971cebcf-ddc9-443e-99aa-6e110026e383\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.960043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jrz\" (UniqueName: \"kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz\") pod \"971cebcf-ddc9-443e-99aa-6e110026e383\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.960290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume\") pod \"971cebcf-ddc9-443e-99aa-6e110026e383\" (UID: \"971cebcf-ddc9-443e-99aa-6e110026e383\") " Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.960581 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume" (OuterVolumeSpecName: "config-volume") pod "971cebcf-ddc9-443e-99aa-6e110026e383" (UID: "971cebcf-ddc9-443e-99aa-6e110026e383"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.961085 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/971cebcf-ddc9-443e-99aa-6e110026e383-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.965824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "971cebcf-ddc9-443e-99aa-6e110026e383" (UID: "971cebcf-ddc9-443e-99aa-6e110026e383"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:15:03 crc kubenswrapper[4764]: I0320 15:15:03.966554 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz" (OuterVolumeSpecName: "kube-api-access-m7jrz") pod "971cebcf-ddc9-443e-99aa-6e110026e383" (UID: "971cebcf-ddc9-443e-99aa-6e110026e383"). InnerVolumeSpecName "kube-api-access-m7jrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:04 crc kubenswrapper[4764]: I0320 15:15:04.062804 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jrz\" (UniqueName: \"kubernetes.io/projected/971cebcf-ddc9-443e-99aa-6e110026e383-kube-api-access-m7jrz\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:04 crc kubenswrapper[4764]: I0320 15:15:04.062833 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/971cebcf-ddc9-443e-99aa-6e110026e383-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:04 crc kubenswrapper[4764]: I0320 15:15:04.371000 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" Mar 20 15:15:04 crc kubenswrapper[4764]: I0320 15:15:04.371017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh" event={"ID":"971cebcf-ddc9-443e-99aa-6e110026e383","Type":"ContainerDied","Data":"12aaa08a8ad96303945272ed13880a4cd8dd3d15589ff1ddce40462a78805762"} Mar 20 15:15:04 crc kubenswrapper[4764]: I0320 15:15:04.371091 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12aaa08a8ad96303945272ed13880a4cd8dd3d15589ff1ddce40462a78805762" Mar 20 15:15:05 crc kubenswrapper[4764]: I0320 15:15:05.547257 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:15:05 crc kubenswrapper[4764]: I0320 15:15:05.549008 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 15:15:05 crc kubenswrapper[4764]: I0320 15:15:05.558305 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:15:05 crc kubenswrapper[4764]: I0320 15:15:05.559345 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 15:15:06 crc kubenswrapper[4764]: I0320 15:15:06.225536 4764 scope.go:117] "RemoveContainer" containerID="b9cc45be1dd166b14b764a39c4a00ef0c78a8b43b1476e3e81b4cf854d2b0758" Mar 20 15:15:07 crc kubenswrapper[4764]: I0320 15:15:07.634585 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 15:15:16 crc kubenswrapper[4764]: I0320 15:15:16.601254 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:17 crc kubenswrapper[4764]: I0320 15:15:17.607217 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 
15:15:20 crc kubenswrapper[4764]: I0320 15:15:20.498663 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="rabbitmq" containerID="cri-o://188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5" gracePeriod=604797 Mar 20 15:15:21 crc kubenswrapper[4764]: I0320 15:15:21.712453 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 15:15:21 crc kubenswrapper[4764]: I0320 15:15:21.893174 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="rabbitmq" containerID="cri-o://bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4" gracePeriod=604796 Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.118650 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.214642 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dxbk\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.214707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.214812 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.214873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.215921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf\") pod \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\" (UID: \"ef1499d4-3bae-40c1-882d-ad9778b9eb80\") " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.217351 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.225952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.226101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.237180 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk" (OuterVolumeSpecName: "kube-api-access-2dxbk") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "kube-api-access-2dxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.261722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.261732 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info" (OuterVolumeSpecName: "pod-info") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.261822 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.266452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data" (OuterVolumeSpecName: "config-data") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.268882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.307224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf" (OuterVolumeSpecName: "server-conf") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319441 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319480 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319492 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dxbk\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-kube-api-access-2dxbk\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319522 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319535 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1499d4-3bae-40c1-882d-ad9778b9eb80-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319546 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1499d4-3bae-40c1-882d-ad9778b9eb80-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319556 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 
15:15:27.319566 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.319579 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1499d4-3bae-40c1-882d-ad9778b9eb80-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.335159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ef1499d4-3bae-40c1-882d-ad9778b9eb80" (UID: "ef1499d4-3bae-40c1-882d-ad9778b9eb80"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.342317 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.421022 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.421057 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1499d4-3bae-40c1-882d-ad9778b9eb80-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.621589 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerID="188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5" exitCode=0 Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.621642 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerDied","Data":"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5"} Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.621669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1499d4-3bae-40c1-882d-ad9778b9eb80","Type":"ContainerDied","Data":"9be30d51b62a9de92cdefbadf886cbd1c223e81fc4a632398508ad1b8784d72b"} Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.621663 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.621684 4764 scope.go:117] "RemoveContainer" containerID="188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.654996 4764 scope.go:117] "RemoveContainer" containerID="9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.658088 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.669408 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.681168 4764 scope.go:117] "RemoveContainer" containerID="188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5" Mar 20 15:15:27 crc kubenswrapper[4764]: E0320 15:15:27.681597 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5\": container with ID starting with 188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5 not found: ID does not exist" 
containerID="188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.681655 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5"} err="failed to get container status \"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5\": rpc error: code = NotFound desc = could not find container \"188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5\": container with ID starting with 188e06fbc04820c8e963aad98e46c4cacbaff180dd614c94c4dd090eca7d63c5 not found: ID does not exist" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.681686 4764 scope.go:117] "RemoveContainer" containerID="9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651" Mar 20 15:15:27 crc kubenswrapper[4764]: E0320 15:15:27.681980 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651\": container with ID starting with 9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651 not found: ID does not exist" containerID="9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.682006 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651"} err="failed to get container status \"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651\": rpc error: code = NotFound desc = could not find container \"9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651\": container with ID starting with 9215e43f927a95a4439a55c48e299c5c67930db52f0f8f187ffa9ea723db8651 not found: ID does not exist" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700021 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:27 crc kubenswrapper[4764]: E0320 15:15:27.700520 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="rabbitmq" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700541 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="rabbitmq" Mar 20 15:15:27 crc kubenswrapper[4764]: E0320 15:15:27.700569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971cebcf-ddc9-443e-99aa-6e110026e383" containerName="collect-profiles" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="971cebcf-ddc9-443e-99aa-6e110026e383" containerName="collect-profiles" Mar 20 15:15:27 crc kubenswrapper[4764]: E0320 15:15:27.700616 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="setup-container" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700625 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="setup-container" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="971cebcf-ddc9-443e-99aa-6e110026e383" containerName="collect-profiles" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.700859 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" containerName="rabbitmq" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.702034 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.704047 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.704462 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-l5h5r" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.704635 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.705038 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.705209 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.705369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.705497 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.742005 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.827756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rntz\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-kube-api-access-9rntz\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.827801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.827832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828066 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.828555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.929918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" 
(UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.929979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930155 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rntz\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-kube-api-access-9rntz\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.930810 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" 
(UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.931115 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.931134 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.931458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.931712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.932456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.938023 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.939285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.939955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.948716 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.962257 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rntz\" (UniqueName: \"kubernetes.io/projected/c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f-kube-api-access-9rntz\") pod \"rabbitmq-server-0\" (UID: \"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:27 crc kubenswrapper[4764]: I0320 15:15:27.967799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: 
\"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f\") " pod="openstack/rabbitmq-server-0" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.080277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.422057 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.542914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.542956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543012 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd4dk\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543041 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543110 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.543288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc 
kubenswrapper[4764]: I0320 15:15:28.543327 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data\") pod \"b497b447-0f6a-47e6-b106-16ca68b88d44\" (UID: \"b497b447-0f6a-47e6-b106-16ca68b88d44\") " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.544062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.545173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.545577 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.547722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.548502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.548588 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info" (OuterVolumeSpecName: "pod-info") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.548886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.549436 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk" (OuterVolumeSpecName: "kube-api-access-sd4dk") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "kube-api-access-sd4dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.573049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data" (OuterVolumeSpecName: "config-data") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.591898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf" (OuterVolumeSpecName: "server-conf") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.635859 4764 generic.go:334] "Generic (PLEG): container finished" podID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerID="bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4" exitCode=0 Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.635935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerDied","Data":"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4"} Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.635962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b497b447-0f6a-47e6-b106-16ca68b88d44","Type":"ContainerDied","Data":"4beb67358a9ecc69ecae0219e0286bb747f91402a76ccfe400d1b9e615da848b"} Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.635968 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.635979 4764 scope.go:117] "RemoveContainer" containerID="bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.641837 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645358 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645410 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b497b447-0f6a-47e6-b106-16ca68b88d44-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645424 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645438 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645448 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b497b447-0f6a-47e6-b106-16ca68b88d44-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645463 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b497b447-0f6a-47e6-b106-16ca68b88d44-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 
15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645476 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645512 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645527 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd4dk\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-kube-api-access-sd4dk\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.645540 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.656552 4764 scope.go:117] "RemoveContainer" containerID="78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.662005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b497b447-0f6a-47e6-b106-16ca68b88d44" (UID: "b497b447-0f6a-47e6-b106-16ca68b88d44"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.667458 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.690439 4764 scope.go:117] "RemoveContainer" containerID="bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4" Mar 20 15:15:28 crc kubenswrapper[4764]: E0320 15:15:28.690892 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4\": container with ID starting with bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4 not found: ID does not exist" containerID="bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.690975 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4"} err="failed to get container status \"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4\": rpc error: code = NotFound desc = could not find container \"bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4\": container with ID starting with bd5ce233536544d630371812fb1b8194ce23249cb840edd52647402650d343d4 not found: ID does not exist" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.691005 4764 scope.go:117] "RemoveContainer" containerID="78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08" Mar 20 15:15:28 crc kubenswrapper[4764]: E0320 15:15:28.691846 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08\": container with ID starting with 
78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08 not found: ID does not exist" containerID="78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.691878 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08"} err="failed to get container status \"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08\": rpc error: code = NotFound desc = could not find container \"78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08\": container with ID starting with 78dbb6b14b25554bc8dfbd32d89aa8308edaafd3936db77f6a6353d6ab780c08 not found: ID does not exist" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.747718 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.747764 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b497b447-0f6a-47e6-b106-16ca68b88d44-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.975330 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:15:28 crc kubenswrapper[4764]: I0320 15:15:28.991709 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.035936 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:15:29 crc kubenswrapper[4764]: E0320 15:15:29.036834 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="rabbitmq" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 
15:15:29.036860 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="rabbitmq" Mar 20 15:15:29 crc kubenswrapper[4764]: E0320 15:15:29.036896 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="setup-container" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.036905 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="setup-container" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.040731 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" containerName="rabbitmq" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.041905 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044280 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044542 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044554 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044639 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044715 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044825 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jb55x" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.044845 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.136083 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b497b447-0f6a-47e6-b106-16ca68b88d44" path="/var/lib/kubelet/pods/b497b447-0f6a-47e6-b106-16ca68b88d44/volumes" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.136882 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1499d4-3bae-40c1-882d-ad9778b9eb80" path="/var/lib/kubelet/pods/ef1499d4-3bae-40c1-882d-ad9778b9eb80/volumes" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eddb3def-0cd3-4d16-954a-dff2909e681f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160705 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eddb3def-0cd3-4d16-954a-dff2909e681f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5g9r\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-kube-api-access-l5g9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.160962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.161011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262363 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262451 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eddb3def-0cd3-4d16-954a-dff2909e681f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262487 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5g9r\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-kube-api-access-l5g9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262564 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262623 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262657 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262693 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eddb3def-0cd3-4d16-954a-dff2909e681f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.262876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.263249 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"eddb3def-0cd3-4d16-954a-dff2909e681f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.263634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.263859 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.264231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.264273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eddb3def-0cd3-4d16-954a-dff2909e681f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.266887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eddb3def-0cd3-4d16-954a-dff2909e681f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc 
kubenswrapper[4764]: I0320 15:15:29.267081 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eddb3def-0cd3-4d16-954a-dff2909e681f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.269431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.270050 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.284063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5g9r\" (UniqueName: \"kubernetes.io/projected/eddb3def-0cd3-4d16-954a-dff2909e681f-kube-api-access-l5g9r\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.297180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"eddb3def-0cd3-4d16-954a-dff2909e681f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.371721 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.650100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f","Type":"ContainerStarted","Data":"441490ffb4a97606d8a0ffa92b927535c82a34617c39b0cab1b8b09f3e77124a"} Mar 20 15:15:29 crc kubenswrapper[4764]: I0320 15:15:29.815736 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.313200 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.314893 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.317465 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.330533 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.384469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.384509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.384554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.384740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.384943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p5h\" (UniqueName: \"kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.385128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.385173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0\") pod 
\"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47p5h\" (UniqueName: \"kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: 
\"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.487359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.488674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.488741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.488981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.489413 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.489480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.489896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.506006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p5h\" (UniqueName: \"kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h\") pod \"dnsmasq-dns-79bd4cc8c9-29ltk\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.632757 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.660780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f","Type":"ContainerStarted","Data":"01fe0ed2dd032ce64b877fab7bd786361a36be45bdd855cee07181e2b829fe8c"} Mar 20 15:15:30 crc kubenswrapper[4764]: I0320 15:15:30.663164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eddb3def-0cd3-4d16-954a-dff2909e681f","Type":"ContainerStarted","Data":"5f06e2cca6d7b1f79125103c699860109cb3d8b9043753eeec5d547b0ce82172"} Mar 20 15:15:31 crc kubenswrapper[4764]: I0320 15:15:31.115286 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:31 crc kubenswrapper[4764]: W0320 15:15:31.187160 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81f6ddd_5431_4a84_985b_5b13f91c44d5.slice/crio-a6995c488a7af2eb9390e3157393218fcb740ed0d263e85a0dc06760388e65a7 WatchSource:0}: Error finding container a6995c488a7af2eb9390e3157393218fcb740ed0d263e85a0dc06760388e65a7: Status 404 returned error can't find the container with id a6995c488a7af2eb9390e3157393218fcb740ed0d263e85a0dc06760388e65a7 Mar 20 15:15:31 crc kubenswrapper[4764]: I0320 15:15:31.674306 4764 generic.go:334] "Generic (PLEG): container finished" podID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerID="cf968ed2bcf0d078ff98c6bc1b46974849085c25dcd5a5473b73f976fc2e49a4" exitCode=0 Mar 20 15:15:31 crc kubenswrapper[4764]: I0320 15:15:31.674540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" event={"ID":"c81f6ddd-5431-4a84-985b-5b13f91c44d5","Type":"ContainerDied","Data":"cf968ed2bcf0d078ff98c6bc1b46974849085c25dcd5a5473b73f976fc2e49a4"} Mar 20 15:15:31 crc kubenswrapper[4764]: 
I0320 15:15:31.675618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" event={"ID":"c81f6ddd-5431-4a84-985b-5b13f91c44d5","Type":"ContainerStarted","Data":"a6995c488a7af2eb9390e3157393218fcb740ed0d263e85a0dc06760388e65a7"} Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.249439 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.254420 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.271961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.429092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg2w\" (UniqueName: \"kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.429243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.429347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content\") pod \"community-operators-bbwsn\" (UID: 
\"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.532189 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg2w\" (UniqueName: \"kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.532252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.532295 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.532869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.532917 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") 
" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.570669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg2w\" (UniqueName: \"kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w\") pod \"community-operators-bbwsn\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.574707 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.685900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" event={"ID":"c81f6ddd-5431-4a84-985b-5b13f91c44d5","Type":"ContainerStarted","Data":"bfabc8e3d47e32cdf496df5f3665c5b95e7ee92ebdc8b25e2578f2958857fe46"} Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.686321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.697999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eddb3def-0cd3-4d16-954a-dff2909e681f","Type":"ContainerStarted","Data":"1de0ffee11390bacd5f6c8504842d2424a2fd87872e5c3840710acba77b69daa"} Mar 20 15:15:32 crc kubenswrapper[4764]: I0320 15:15:32.718350 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" podStartSLOduration=2.718329798 podStartE2EDuration="2.718329798s" podCreationTimestamp="2026-03-20 15:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:15:32.708044114 +0000 UTC m=+1454.324233243" watchObservedRunningTime="2026-03-20 15:15:32.718329798 +0000 
UTC m=+1454.334518927" Mar 20 15:15:33 crc kubenswrapper[4764]: I0320 15:15:33.072959 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:33 crc kubenswrapper[4764]: I0320 15:15:33.708951 4764 generic.go:334] "Generic (PLEG): container finished" podID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerID="c2017aa0a6790ba94858880e3f9f54f074de3bb0415da8230f25b2eef1f3e7d5" exitCode=0 Mar 20 15:15:33 crc kubenswrapper[4764]: I0320 15:15:33.709018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerDied","Data":"c2017aa0a6790ba94858880e3f9f54f074de3bb0415da8230f25b2eef1f3e7d5"} Mar 20 15:15:33 crc kubenswrapper[4764]: I0320 15:15:33.709247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerStarted","Data":"e9e51015b9a40145173f17cebc938f8b8ae9273174a14256be056f03376e4358"} Mar 20 15:15:33 crc kubenswrapper[4764]: I0320 15:15:33.711175 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:15:34 crc kubenswrapper[4764]: I0320 15:15:34.720927 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerStarted","Data":"64f06a952fec2ae4de5a96374e244b2164a10cf0931f0292cc6c932cca974130"} Mar 20 15:15:35 crc kubenswrapper[4764]: I0320 15:15:35.737230 4764 generic.go:334] "Generic (PLEG): container finished" podID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerID="64f06a952fec2ae4de5a96374e244b2164a10cf0931f0292cc6c932cca974130" exitCode=0 Mar 20 15:15:35 crc kubenswrapper[4764]: I0320 15:15:35.737293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerDied","Data":"64f06a952fec2ae4de5a96374e244b2164a10cf0931f0292cc6c932cca974130"} Mar 20 15:15:37 crc kubenswrapper[4764]: I0320 15:15:37.754405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerStarted","Data":"700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3"} Mar 20 15:15:37 crc kubenswrapper[4764]: I0320 15:15:37.787071 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbwsn" podStartSLOduration=2.887996711 podStartE2EDuration="5.787046768s" podCreationTimestamp="2026-03-20 15:15:32 +0000 UTC" firstStartedPulling="2026-03-20 15:15:33.710804239 +0000 UTC m=+1455.326993398" lastFinishedPulling="2026-03-20 15:15:36.609854316 +0000 UTC m=+1458.226043455" observedRunningTime="2026-03-20 15:15:37.778464126 +0000 UTC m=+1459.394653295" watchObservedRunningTime="2026-03-20 15:15:37.787046768 +0000 UTC m=+1459.403235937" Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.634524 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.711771 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"] Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.712041 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="dnsmasq-dns" containerID="cri-o://e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a" gracePeriod=10 Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.902548 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-55478c4467-nlkxh"] Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.905178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:40 crc kubenswrapper[4764]: I0320 15:15:40.921554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-nlkxh"] Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-config\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " 
pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkpjl\" (UniqueName: \"kubernetes.io/projected/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-kube-api-access-jkpjl\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.000969 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-svc\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-svc\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " 
pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-config\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.102709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkpjl\" (UniqueName: \"kubernetes.io/projected/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-kube-api-access-jkpjl\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " 
pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.103971 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.104722 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.104735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.105225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.105976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-dns-svc\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: 
I0320 15:15:41.106132 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-config\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.134593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkpjl\" (UniqueName: \"kubernetes.io/projected/4ab04f17-2a4f-4352-9ef0-8daf105eb96d-kube-api-access-jkpjl\") pod \"dnsmasq-dns-55478c4467-nlkxh\" (UID: \"4ab04f17-2a4f-4352-9ef0-8daf105eb96d\") " pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.225507 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.288065 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417062 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417407 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v7ht\" (UniqueName: \"kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.417706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb\") pod \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\" (UID: \"53479118-a3ab-481a-b7f5-8ad3cdc1828e\") " Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.421163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht" (OuterVolumeSpecName: "kube-api-access-7v7ht") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "kube-api-access-7v7ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.460581 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config" (OuterVolumeSpecName: "config") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.462737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.471852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.473736 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.477768 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53479118-a3ab-481a-b7f5-8ad3cdc1828e" (UID: "53479118-a3ab-481a-b7f5-8ad3cdc1828e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519824 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519851 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519860 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v7ht\" (UniqueName: \"kubernetes.io/projected/53479118-a3ab-481a-b7f5-8ad3cdc1828e-kube-api-access-7v7ht\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519871 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 
15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519882 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.519891 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53479118-a3ab-481a-b7f5-8ad3cdc1828e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.704839 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-nlkxh"] Mar 20 15:15:41 crc kubenswrapper[4764]: W0320 15:15:41.711756 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab04f17_2a4f_4352_9ef0_8daf105eb96d.slice/crio-2cef7e442c107f61cbcc85a35ec0017401328cb85ac2b4a0d7502a11ab66f973 WatchSource:0}: Error finding container 2cef7e442c107f61cbcc85a35ec0017401328cb85ac2b4a0d7502a11ab66f973: Status 404 returned error can't find the container with id 2cef7e442c107f61cbcc85a35ec0017401328cb85ac2b4a0d7502a11ab66f973 Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.809590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" event={"ID":"4ab04f17-2a4f-4352-9ef0-8daf105eb96d","Type":"ContainerStarted","Data":"2cef7e442c107f61cbcc85a35ec0017401328cb85ac2b4a0d7502a11ab66f973"} Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.811204 4764 generic.go:334] "Generic (PLEG): container finished" podID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerID="e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a" exitCode=0 Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.811230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" 
event={"ID":"53479118-a3ab-481a-b7f5-8ad3cdc1828e","Type":"ContainerDied","Data":"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a"} Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.811248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" event={"ID":"53479118-a3ab-481a-b7f5-8ad3cdc1828e","Type":"ContainerDied","Data":"fba15d33437f5d4d3102b9595ee458f8b304ac1360e3c3067da9e79cbded5e38"} Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.811266 4764 scope.go:117] "RemoveContainer" containerID="e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.811422 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-gsbfb" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.839166 4764 scope.go:117] "RemoveContainer" containerID="8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.934925 4764 scope.go:117] "RemoveContainer" containerID="e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a" Mar 20 15:15:41 crc kubenswrapper[4764]: E0320 15:15:41.935932 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a\": container with ID starting with e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a not found: ID does not exist" containerID="e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.935967 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a"} err="failed to get container status \"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a\": rpc error: 
code = NotFound desc = could not find container \"e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a\": container with ID starting with e8c80eb02e731a8c1bfe3289cbe30a3a37303af8bee70b74b915444f27fe7f9a not found: ID does not exist" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.935986 4764 scope.go:117] "RemoveContainer" containerID="8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331" Mar 20 15:15:41 crc kubenswrapper[4764]: E0320 15:15:41.936587 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331\": container with ID starting with 8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331 not found: ID does not exist" containerID="8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.936607 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331"} err="failed to get container status \"8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331\": rpc error: code = NotFound desc = could not find container \"8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331\": container with ID starting with 8a2d4b51982499a3c4e3d2694ee3f76551f4f2e113ce40a9180d049f0e6b7331 not found: ID does not exist" Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.990532 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"] Mar 20 15:15:41 crc kubenswrapper[4764]: I0320 15:15:41.999653 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-gsbfb"] Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.575250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.575314 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.635549 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.832160 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ab04f17-2a4f-4352-9ef0-8daf105eb96d" containerID="5fafd78fea86c0faa6446f5673e346b144b6d95028d1faa0910274b2fc50c53f" exitCode=0 Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.832275 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" event={"ID":"4ab04f17-2a4f-4352-9ef0-8daf105eb96d","Type":"ContainerDied","Data":"5fafd78fea86c0faa6446f5673e346b144b6d95028d1faa0910274b2fc50c53f"} Mar 20 15:15:42 crc kubenswrapper[4764]: I0320 15:15:42.902239 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:43 crc kubenswrapper[4764]: I0320 15:15:43.153356 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" path="/var/lib/kubelet/pods/53479118-a3ab-481a-b7f5-8ad3cdc1828e/volumes" Mar 20 15:15:43 crc kubenswrapper[4764]: I0320 15:15:43.845945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" event={"ID":"4ab04f17-2a4f-4352-9ef0-8daf105eb96d","Type":"ContainerStarted","Data":"2172992c6165e38011f196278c1fd77106774eb9e2379d4e5db12a255d1b3a33"} Mar 20 15:15:43 crc kubenswrapper[4764]: I0320 15:15:43.893710 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" podStartSLOduration=3.893676176 
podStartE2EDuration="3.893676176s" podCreationTimestamp="2026-03-20 15:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:15:43.881294047 +0000 UTC m=+1465.497483176" watchObservedRunningTime="2026-03-20 15:15:43.893676176 +0000 UTC m=+1465.509865325" Mar 20 15:15:44 crc kubenswrapper[4764]: I0320 15:15:44.870428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.618695 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:15:46 crc kubenswrapper[4764]: E0320 15:15:46.619418 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="dnsmasq-dns" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.619430 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="dnsmasq-dns" Mar 20 15:15:46 crc kubenswrapper[4764]: E0320 15:15:46.619448 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="init" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.619454 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="init" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.619628 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="53479118-a3ab-481a-b7f5-8ad3cdc1828e" containerName="dnsmasq-dns" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.620999 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.641160 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.744495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.744955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqc2n\" (UniqueName: \"kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.745035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.791666 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.794704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.812943 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.847161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqc2n\" (UniqueName: \"kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.847211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.847246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.847735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.848245 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.866620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqc2n\" (UniqueName: \"kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n\") pod \"redhat-marketplace-xn5t6\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.948962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.949391 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.949533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:46 crc kubenswrapper[4764]: I0320 15:15:46.949834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf56f\" (UniqueName: \"kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.051812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.052152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf56f\" (UniqueName: \"kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.052246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities\") pod \"redhat-operators-jsp8s\" (UID: 
\"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.052889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.053251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.082882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf56f\" (UniqueName: \"kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f\") pod \"redhat-operators-jsp8s\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.111003 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.437914 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.609356 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:15:47 crc kubenswrapper[4764]: W0320 15:15:47.610043 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc4bf2b3_90f1_4faf_8e98_16004adc36ef.slice/crio-78f79add4ed7bcbe87e1f78abe611c2fbf60578668eb0123a78e9464ce311c90 WatchSource:0}: Error finding container 78f79add4ed7bcbe87e1f78abe611c2fbf60578668eb0123a78e9464ce311c90: Status 404 returned error can't find the container with id 78f79add4ed7bcbe87e1f78abe611c2fbf60578668eb0123a78e9464ce311c90 Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.906766 4764 generic.go:334] "Generic (PLEG): container finished" podID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerID="2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932" exitCode=0 Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.906851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerDied","Data":"2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932"} Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.906885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerStarted","Data":"78f79add4ed7bcbe87e1f78abe611c2fbf60578668eb0123a78e9464ce311c90"} Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.909876 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerID="d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804" exitCode=0 Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.909914 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerDied","Data":"d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804"} Mar 20 15:15:47 crc kubenswrapper[4764]: I0320 15:15:47.909936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerStarted","Data":"afbefff53c5cfda4040a50c553a943181e54a50f2f1ebebead20ac8031fb4c31"} Mar 20 15:15:48 crc kubenswrapper[4764]: I0320 15:15:48.924676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerStarted","Data":"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401"} Mar 20 15:15:48 crc kubenswrapper[4764]: I0320 15:15:48.934204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerStarted","Data":"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e"} Mar 20 15:15:49 crc kubenswrapper[4764]: I0320 15:15:49.946130 4764 generic.go:334] "Generic (PLEG): container finished" podID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerID="a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401" exitCode=0 Mar 20 15:15:49 crc kubenswrapper[4764]: I0320 15:15:49.947716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerDied","Data":"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401"} 
Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.405705 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.411993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.421121 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.522091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.522367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjz7\" (UniqueName: \"kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.522549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.624795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.625005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjz7\" (UniqueName: \"kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.625063 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.625733 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.625751 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.648536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjz7\" (UniqueName: 
\"kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7\") pod \"certified-operators-zhfsz\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.730103 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.986744 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:50 crc kubenswrapper[4764]: I0320 15:15:50.992811 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bbwsn" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="registry-server" containerID="cri-o://700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" gracePeriod=2 Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.227570 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-nlkxh" Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.274655 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:15:51 crc kubenswrapper[4764]: W0320 15:15:51.275356 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb751f5_9184_4f6f_8260_663248c52af3.slice/crio-09c5be3ef7350fecb40e3f7e499beac8e6e0bb30522b9edd13e18568239af078 WatchSource:0}: Error finding container 09c5be3ef7350fecb40e3f7e499beac8e6e0bb30522b9edd13e18568239af078: Status 404 returned error can't find the container with id 09c5be3ef7350fecb40e3f7e499beac8e6e0bb30522b9edd13e18568239af078 Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.292621 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.292860 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="dnsmasq-dns" containerID="cri-o://bfabc8e3d47e32cdf496df5f3665c5b95e7ee92ebdc8b25e2578f2958857fe46" gracePeriod=10 Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.992608 4764 generic.go:334] "Generic (PLEG): container finished" podID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerID="88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e" exitCode=0 Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.992674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerDied","Data":"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e"} Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.996840 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerStarted","Data":"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080"} Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.998977 4764 generic.go:334] "Generic (PLEG): container finished" podID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerID="bfabc8e3d47e32cdf496df5f3665c5b95e7ee92ebdc8b25e2578f2958857fe46" exitCode=0 Mar 20 15:15:51 crc kubenswrapper[4764]: I0320 15:15:51.999053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" event={"ID":"c81f6ddd-5431-4a84-985b-5b13f91c44d5","Type":"ContainerDied","Data":"bfabc8e3d47e32cdf496df5f3665c5b95e7ee92ebdc8b25e2578f2958857fe46"} Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.001751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerStarted","Data":"420c91b0a587d0ff0d77cc651672cbcedba9bbe9ad31483d707b960fc346affd"} Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.001784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerStarted","Data":"09c5be3ef7350fecb40e3f7e499beac8e6e0bb30522b9edd13e18568239af078"} Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.060703 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xn5t6" podStartSLOduration=3.212876494 podStartE2EDuration="6.060683677s" podCreationTimestamp="2026-03-20 15:15:46 +0000 UTC" firstStartedPulling="2026-03-20 15:15:47.912131259 +0000 UTC m=+1469.528320388" lastFinishedPulling="2026-03-20 15:15:50.759938442 +0000 UTC m=+1472.376127571" observedRunningTime="2026-03-20 15:15:52.056886831 +0000 UTC m=+1473.673075950" watchObservedRunningTime="2026-03-20 15:15:52.060683677 +0000 UTC m=+1473.676872816" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.450095 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559193 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559301 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47p5h\" 
(UniqueName: \"kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.559423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0\") pod \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\" (UID: \"c81f6ddd-5431-4a84-985b-5b13f91c44d5\") " Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.565862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h" (OuterVolumeSpecName: "kube-api-access-47p5h") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "kube-api-access-47p5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: E0320 15:15:52.577671 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3 is running failed: container process not found" containerID="700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 15:15:52 crc kubenswrapper[4764]: E0320 15:15:52.578012 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3 is running failed: container process not found" containerID="700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 15:15:52 crc kubenswrapper[4764]: E0320 15:15:52.578263 4764 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3 is running failed: container process not found" containerID="700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 15:15:52 crc kubenswrapper[4764]: E0320 15:15:52.578286 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-bbwsn" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="registry-server" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.615661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.623910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.624941 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config" (OuterVolumeSpecName: "config") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.626793 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.627135 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.632488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c81f6ddd-5431-4a84-985b-5b13f91c44d5" (UID: "c81f6ddd-5431-4a84-985b-5b13f91c44d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.661942 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-config\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.661973 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.661986 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47p5h\" (UniqueName: \"kubernetes.io/projected/c81f6ddd-5431-4a84-985b-5b13f91c44d5-kube-api-access-47p5h\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.661995 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.662004 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.662013 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:52 crc kubenswrapper[4764]: I0320 15:15:52.662021 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c81f6ddd-5431-4a84-985b-5b13f91c44d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.011661 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" event={"ID":"c81f6ddd-5431-4a84-985b-5b13f91c44d5","Type":"ContainerDied","Data":"a6995c488a7af2eb9390e3157393218fcb740ed0d263e85a0dc06760388e65a7"} Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.011723 4764 scope.go:117] "RemoveContainer" containerID="bfabc8e3d47e32cdf496df5f3665c5b95e7ee92ebdc8b25e2578f2958857fe46" Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.011896 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-29ltk" Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.045516 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.052444 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-29ltk"] Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.141671 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" path="/var/lib/kubelet/pods/c81f6ddd-5431-4a84-985b-5b13f91c44d5/volumes" Mar 20 15:15:53 crc kubenswrapper[4764]: I0320 15:15:53.216432 4764 scope.go:117] "RemoveContainer" containerID="cf968ed2bcf0d078ff98c6bc1b46974849085c25dcd5a5473b73f976fc2e49a4" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.021042 4764 generic.go:334] "Generic (PLEG): container finished" podID="ebb751f5-9184-4f6f-8260-663248c52af3" containerID="420c91b0a587d0ff0d77cc651672cbcedba9bbe9ad31483d707b960fc346affd" exitCode=0 Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.021121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerDied","Data":"420c91b0a587d0ff0d77cc651672cbcedba9bbe9ad31483d707b960fc346affd"} Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 
15:15:54.027819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerStarted","Data":"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79"} Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.032261 4764 generic.go:334] "Generic (PLEG): container finished" podID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerID="700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" exitCode=0 Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.032777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerDied","Data":"700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3"} Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.066896 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jsp8s" podStartSLOduration=2.654453581 podStartE2EDuration="8.066873767s" podCreationTimestamp="2026-03-20 15:15:46 +0000 UTC" firstStartedPulling="2026-03-20 15:15:47.909063796 +0000 UTC m=+1469.525252925" lastFinishedPulling="2026-03-20 15:15:53.321483982 +0000 UTC m=+1474.937673111" observedRunningTime="2026-03-20 15:15:54.060161461 +0000 UTC m=+1475.676350590" watchObservedRunningTime="2026-03-20 15:15:54.066873767 +0000 UTC m=+1475.683062916" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.211771 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.299419 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities\") pod \"be492ff2-6fb7-4498-afed-6c25924fb0a9\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.299501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg2w\" (UniqueName: \"kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w\") pod \"be492ff2-6fb7-4498-afed-6c25924fb0a9\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.299612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content\") pod \"be492ff2-6fb7-4498-afed-6c25924fb0a9\" (UID: \"be492ff2-6fb7-4498-afed-6c25924fb0a9\") " Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.300273 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities" (OuterVolumeSpecName: "utilities") pod "be492ff2-6fb7-4498-afed-6c25924fb0a9" (UID: "be492ff2-6fb7-4498-afed-6c25924fb0a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.318900 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w" (OuterVolumeSpecName: "kube-api-access-gpg2w") pod "be492ff2-6fb7-4498-afed-6c25924fb0a9" (UID: "be492ff2-6fb7-4498-afed-6c25924fb0a9"). InnerVolumeSpecName "kube-api-access-gpg2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.346300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be492ff2-6fb7-4498-afed-6c25924fb0a9" (UID: "be492ff2-6fb7-4498-afed-6c25924fb0a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.401252 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.401283 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg2w\" (UniqueName: \"kubernetes.io/projected/be492ff2-6fb7-4498-afed-6c25924fb0a9-kube-api-access-gpg2w\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:54 crc kubenswrapper[4764]: I0320 15:15:54.401295 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be492ff2-6fb7-4498-afed-6c25924fb0a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.045576 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerStarted","Data":"b9c4ae3ca846ce2f299fec62e6e1f36b8f93dbe87fe24fff26098cf637c530d3"} Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.049136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbwsn" event={"ID":"be492ff2-6fb7-4498-afed-6c25924fb0a9","Type":"ContainerDied","Data":"e9e51015b9a40145173f17cebc938f8b8ae9273174a14256be056f03376e4358"} Mar 20 15:15:55 crc kubenswrapper[4764]: 
I0320 15:15:55.049184 4764 scope.go:117] "RemoveContainer" containerID="700beab650071655139973720dc5765a30af843fe5ec22d99ebf9bdb73ec94d3" Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.049205 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbwsn" Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.084007 4764 scope.go:117] "RemoveContainer" containerID="64f06a952fec2ae4de5a96374e244b2164a10cf0931f0292cc6c932cca974130" Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.109755 4764 scope.go:117] "RemoveContainer" containerID="c2017aa0a6790ba94858880e3f9f54f074de3bb0415da8230f25b2eef1f3e7d5" Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.113280 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.122094 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bbwsn"] Mar 20 15:15:55 crc kubenswrapper[4764]: I0320 15:15:55.137298 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" path="/var/lib/kubelet/pods/be492ff2-6fb7-4498-afed-6c25924fb0a9/volumes" Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.742571 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.746398 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.757836 4764 generic.go:334] "Generic (PLEG): container finished" podID="ebb751f5-9184-4f6f-8260-663248c52af3" containerID="b9c4ae3ca846ce2f299fec62e6e1f36b8f93dbe87fe24fff26098cf637c530d3" exitCode=0 Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.769949 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerDied","Data":"b9c4ae3ca846ce2f299fec62e6e1f36b8f93dbe87fe24fff26098cf637c530d3"} Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.770879 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.774592 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:15:57 crc kubenswrapper[4764]: I0320 15:15:57.805989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:15:58 crc kubenswrapper[4764]: I0320 15:15:58.782016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerStarted","Data":"19fcb6c00659629ba95fc38c65aba47d9a992f6d9e18653a1c93299a50ac32f8"} Mar 20 15:15:58 crc kubenswrapper[4764]: I0320 15:15:58.807560 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jsp8s" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" probeResult="failure" output=< Mar 20 15:15:58 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:15:58 crc kubenswrapper[4764]: > Mar 20 15:15:58 crc kubenswrapper[4764]: I0320 15:15:58.808228 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zhfsz" podStartSLOduration=4.624084783 podStartE2EDuration="8.808210677s" podCreationTimestamp="2026-03-20 15:15:50 +0000 UTC" firstStartedPulling="2026-03-20 15:15:54.023289395 +0000 UTC m=+1475.639478544" lastFinishedPulling="2026-03-20 15:15:58.207415309 +0000 UTC 
m=+1479.823604438" observedRunningTime="2026-03-20 15:15:58.801928715 +0000 UTC m=+1480.418117854" watchObservedRunningTime="2026-03-20 15:15:58.808210677 +0000 UTC m=+1480.424399816" Mar 20 15:15:58 crc kubenswrapper[4764]: I0320 15:15:58.839209 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.145943 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566996-cz9h9"] Mar 20 15:16:00 crc kubenswrapper[4764]: E0320 15:16:00.146930 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="extract-utilities" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.146964 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="extract-utilities" Mar 20 15:16:00 crc kubenswrapper[4764]: E0320 15:16:00.146993 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="dnsmasq-dns" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147009 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="dnsmasq-dns" Mar 20 15:16:00 crc kubenswrapper[4764]: E0320 15:16:00.147054 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="init" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147074 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="init" Mar 20 15:16:00 crc kubenswrapper[4764]: E0320 15:16:00.147111 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="extract-content" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147128 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="extract-content" Mar 20 15:16:00 crc kubenswrapper[4764]: E0320 15:16:00.147172 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="registry-server" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147189 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="registry-server" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147570 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81f6ddd-5431-4a84-985b-5b13f91c44d5" containerName="dnsmasq-dns" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.147614 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="be492ff2-6fb7-4498-afed-6c25924fb0a9" containerName="registry-server" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.148884 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.150822 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.152124 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.156965 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.157078 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566996-cz9h9"] Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.275201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzsc\" (UniqueName: \"kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc\") pod \"auto-csr-approver-29566996-cz9h9\" (UID: \"08d3f786-2402-4bee-ba83-c97c9c8b2209\") " pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.377082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzsc\" (UniqueName: \"kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc\") pod \"auto-csr-approver-29566996-cz9h9\" (UID: \"08d3f786-2402-4bee-ba83-c97c9c8b2209\") " pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.413495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzsc\" (UniqueName: \"kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc\") pod \"auto-csr-approver-29566996-cz9h9\" (UID: \"08d3f786-2402-4bee-ba83-c97c9c8b2209\") " 
pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.465539 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.730543 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.733590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.794693 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.796802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:00 crc kubenswrapper[4764]: I0320 15:16:00.961432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566996-cz9h9"] Mar 20 15:16:00 crc kubenswrapper[4764]: W0320 15:16:00.961966 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08d3f786_2402_4bee_ba83_c97c9c8b2209.slice/crio-5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a WatchSource:0}: Error finding container 5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a: Status 404 returned error can't find the container with id 5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a Mar 20 15:16:01 crc kubenswrapper[4764]: I0320 15:16:01.812710 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" 
event={"ID":"08d3f786-2402-4bee-ba83-c97c9c8b2209","Type":"ContainerStarted","Data":"5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a"} Mar 20 15:16:01 crc kubenswrapper[4764]: I0320 15:16:01.812978 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xn5t6" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="registry-server" containerID="cri-o://9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080" gracePeriod=2 Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.303196 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.417514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content\") pod \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.417627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqc2n\" (UniqueName: \"kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n\") pod \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.417780 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities\") pod \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\" (UID: \"d2b77c3b-7b11-4ea8-9127-5b896c183c7e\") " Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.418688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities" (OuterVolumeSpecName: "utilities") pod "d2b77c3b-7b11-4ea8-9127-5b896c183c7e" (UID: "d2b77c3b-7b11-4ea8-9127-5b896c183c7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.424941 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n" (OuterVolumeSpecName: "kube-api-access-cqc2n") pod "d2b77c3b-7b11-4ea8-9127-5b896c183c7e" (UID: "d2b77c3b-7b11-4ea8-9127-5b896c183c7e"). InnerVolumeSpecName "kube-api-access-cqc2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.444267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2b77c3b-7b11-4ea8-9127-5b896c183c7e" (UID: "d2b77c3b-7b11-4ea8-9127-5b896c183c7e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.520173 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.520206 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqc2n\" (UniqueName: \"kubernetes.io/projected/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-kube-api-access-cqc2n\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.520217 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b77c3b-7b11-4ea8-9127-5b896c183c7e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.825358 4764 generic.go:334] "Generic (PLEG): container finished" podID="c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f" containerID="01fe0ed2dd032ce64b877fab7bd786361a36be45bdd855cee07181e2b829fe8c" exitCode=0 Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.825468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f","Type":"ContainerDied","Data":"01fe0ed2dd032ce64b877fab7bd786361a36be45bdd855cee07181e2b829fe8c"} Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.831308 4764 generic.go:334] "Generic (PLEG): container finished" podID="08d3f786-2402-4bee-ba83-c97c9c8b2209" containerID="22ec6044cdf531fba5b90673244c98c7b116936f8515281f0071eec8049aba68" exitCode=0 Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.831467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" 
event={"ID":"08d3f786-2402-4bee-ba83-c97c9c8b2209","Type":"ContainerDied","Data":"22ec6044cdf531fba5b90673244c98c7b116936f8515281f0071eec8049aba68"} Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.837967 4764 generic.go:334] "Generic (PLEG): container finished" podID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerID="9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080" exitCode=0 Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.838021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerDied","Data":"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080"} Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.838053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xn5t6" event={"ID":"d2b77c3b-7b11-4ea8-9127-5b896c183c7e","Type":"ContainerDied","Data":"afbefff53c5cfda4040a50c553a943181e54a50f2f1ebebead20ac8031fb4c31"} Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.838082 4764 scope.go:117] "RemoveContainer" containerID="9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.838275 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xn5t6" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.864723 4764 scope.go:117] "RemoveContainer" containerID="a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.903184 4764 scope.go:117] "RemoveContainer" containerID="d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.903451 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.938964 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xn5t6"] Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.958557 4764 scope.go:117] "RemoveContainer" containerID="9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080" Mar 20 15:16:02 crc kubenswrapper[4764]: E0320 15:16:02.959136 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080\": container with ID starting with 9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080 not found: ID does not exist" containerID="9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.959292 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080"} err="failed to get container status \"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080\": rpc error: code = NotFound desc = could not find container \"9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080\": container with ID starting with 9ee792e456347e478cecad2b7f8f9b7ae6ea1e7d9b2cdb4d66a9a6c405cb2080 not found: 
ID does not exist" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.959389 4764 scope.go:117] "RemoveContainer" containerID="a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401" Mar 20 15:16:02 crc kubenswrapper[4764]: E0320 15:16:02.960567 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401\": container with ID starting with a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401 not found: ID does not exist" containerID="a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.960670 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401"} err="failed to get container status \"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401\": rpc error: code = NotFound desc = could not find container \"a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401\": container with ID starting with a580627a8481edd42b5a8acc19b2b640b85b343e383fcf88eb49630ee1835401 not found: ID does not exist" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.960775 4764 scope.go:117] "RemoveContainer" containerID="d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804" Mar 20 15:16:02 crc kubenswrapper[4764]: E0320 15:16:02.961268 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804\": container with ID starting with d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804 not found: ID does not exist" containerID="d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804" Mar 20 15:16:02 crc kubenswrapper[4764]: I0320 15:16:02.961355 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804"} err="failed to get container status \"d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804\": rpc error: code = NotFound desc = could not find container \"d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804\": container with ID starting with d9415d39eeb8b052770796e95cc32879551f0a21208e04cc89fea6e67abda804 not found: ID does not exist" Mar 20 15:16:03 crc kubenswrapper[4764]: I0320 15:16:03.136956 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" path="/var/lib/kubelet/pods/d2b77c3b-7b11-4ea8-9127-5b896c183c7e/volumes" Mar 20 15:16:03 crc kubenswrapper[4764]: I0320 15:16:03.847997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f","Type":"ContainerStarted","Data":"d637c19d7e9d88aa7ebdf804e0e4a42951ec23d610a59b8722ad417092eef82c"} Mar 20 15:16:03 crc kubenswrapper[4764]: I0320 15:16:03.849863 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 15:16:03 crc kubenswrapper[4764]: I0320 15:16:03.879615 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.879598669 podStartE2EDuration="36.879598669s" podCreationTimestamp="2026-03-20 15:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:16:03.876196415 +0000 UTC m=+1485.492385544" watchObservedRunningTime="2026-03-20 15:16:03.879598669 +0000 UTC m=+1485.495787798" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.180891 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.271187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkzsc\" (UniqueName: \"kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc\") pod \"08d3f786-2402-4bee-ba83-c97c9c8b2209\" (UID: \"08d3f786-2402-4bee-ba83-c97c9c8b2209\") " Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.289164 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc" (OuterVolumeSpecName: "kube-api-access-xkzsc") pod "08d3f786-2402-4bee-ba83-c97c9c8b2209" (UID: "08d3f786-2402-4bee-ba83-c97c9c8b2209"). InnerVolumeSpecName "kube-api-access-xkzsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.373827 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkzsc\" (UniqueName: \"kubernetes.io/projected/08d3f786-2402-4bee-ba83-c97c9c8b2209-kube-api-access-xkzsc\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510136 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b"] Mar 20 15:16:04 crc kubenswrapper[4764]: E0320 15:16:04.510591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="extract-content" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="extract-content" Mar 20 15:16:04 crc kubenswrapper[4764]: E0320 15:16:04.510640 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d3f786-2402-4bee-ba83-c97c9c8b2209" containerName="oc" Mar 20 15:16:04 crc 
kubenswrapper[4764]: I0320 15:16:04.510650 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d3f786-2402-4bee-ba83-c97c9c8b2209" containerName="oc" Mar 20 15:16:04 crc kubenswrapper[4764]: E0320 15:16:04.510675 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="extract-utilities" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510684 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="extract-utilities" Mar 20 15:16:04 crc kubenswrapper[4764]: E0320 15:16:04.510696 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="registry-server" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="registry-server" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510930 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b77c3b-7b11-4ea8-9127-5b896c183c7e" containerName="registry-server" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.510952 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d3f786-2402-4bee-ba83-c97c9c8b2209" containerName="oc" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.511757 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.518269 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.518487 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.518644 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.518822 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.522792 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b"] Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.680107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h45fc\" (UniqueName: \"kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.680185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: 
I0320 15:16:04.680227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.680286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.781934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h45fc\" (UniqueName: \"kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.782269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.782300 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.782340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.787028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.787563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.797612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.798171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h45fc\" (UniqueName: \"kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.844874 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.863704 4764 generic.go:334] "Generic (PLEG): container finished" podID="eddb3def-0cd3-4d16-954a-dff2909e681f" containerID="1de0ffee11390bacd5f6c8504842d2424a2fd87872e5c3840710acba77b69daa" exitCode=0 Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.863822 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eddb3def-0cd3-4d16-954a-dff2909e681f","Type":"ContainerDied","Data":"1de0ffee11390bacd5f6c8504842d2424a2fd87872e5c3840710acba77b69daa"} Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.869186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" event={"ID":"08d3f786-2402-4bee-ba83-c97c9c8b2209","Type":"ContainerDied","Data":"5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a"} Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.869253 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef465a26d6bb2f2ae0cde4a5c195f3c52ce3b58d1168f1a4fae47d9df6e228a" Mar 20 15:16:04 crc kubenswrapper[4764]: I0320 15:16:04.869279 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566996-cz9h9" Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.246571 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566990-w9x6v"] Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.254056 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566990-w9x6v"] Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.378923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b"] Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.878936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" event={"ID":"2bc8745c-8628-48f5-9562-1de3f0c30286","Type":"ContainerStarted","Data":"db36758245cd7b92eaec32f4acdc2475ef0dcbe81c67434c83ce791c6cb1f12f"} Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.881148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"eddb3def-0cd3-4d16-954a-dff2909e681f","Type":"ContainerStarted","Data":"0a661317d58bf2c254a07c7d721a16c45ca591025eb2681aed47a32f9bf403fc"} Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.882778 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:16:05 crc kubenswrapper[4764]: I0320 15:16:05.930108 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.930086072 podStartE2EDuration="37.930086072s" podCreationTimestamp="2026-03-20 15:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:16:05.916881979 +0000 UTC m=+1487.533071108" watchObservedRunningTime="2026-03-20 15:16:05.930086072 +0000 UTC 
m=+1487.546275201" Mar 20 15:16:07 crc kubenswrapper[4764]: I0320 15:16:07.138997 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29f5ab4-c8db-4425-91e8-758e08b9caa8" path="/var/lib/kubelet/pods/a29f5ab4-c8db-4425-91e8-758e08b9caa8/volumes" Mar 20 15:16:08 crc kubenswrapper[4764]: I0320 15:16:08.168220 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jsp8s" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" probeResult="failure" output=< Mar 20 15:16:08 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:16:08 crc kubenswrapper[4764]: > Mar 20 15:16:10 crc kubenswrapper[4764]: I0320 15:16:10.779742 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:10 crc kubenswrapper[4764]: I0320 15:16:10.832167 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:16:10 crc kubenswrapper[4764]: I0320 15:16:10.944285 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zhfsz" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="registry-server" containerID="cri-o://19fcb6c00659629ba95fc38c65aba47d9a992f6d9e18653a1c93299a50ac32f8" gracePeriod=2 Mar 20 15:16:11 crc kubenswrapper[4764]: I0320 15:16:11.955617 4764 generic.go:334] "Generic (PLEG): container finished" podID="ebb751f5-9184-4f6f-8260-663248c52af3" containerID="19fcb6c00659629ba95fc38c65aba47d9a992f6d9e18653a1c93299a50ac32f8" exitCode=0 Mar 20 15:16:11 crc kubenswrapper[4764]: I0320 15:16:11.955690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerDied","Data":"19fcb6c00659629ba95fc38c65aba47d9a992f6d9e18653a1c93299a50ac32f8"} 
Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.299783 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.364840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content\") pod \"ebb751f5-9184-4f6f-8260-663248c52af3\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.364905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rjz7\" (UniqueName: \"kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7\") pod \"ebb751f5-9184-4f6f-8260-663248c52af3\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.365076 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities\") pod \"ebb751f5-9184-4f6f-8260-663248c52af3\" (UID: \"ebb751f5-9184-4f6f-8260-663248c52af3\") " Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.366933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities" (OuterVolumeSpecName: "utilities") pod "ebb751f5-9184-4f6f-8260-663248c52af3" (UID: "ebb751f5-9184-4f6f-8260-663248c52af3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.371253 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7" (OuterVolumeSpecName: "kube-api-access-6rjz7") pod "ebb751f5-9184-4f6f-8260-663248c52af3" (UID: "ebb751f5-9184-4f6f-8260-663248c52af3"). InnerVolumeSpecName "kube-api-access-6rjz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.416438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb751f5-9184-4f6f-8260-663248c52af3" (UID: "ebb751f5-9184-4f6f-8260-663248c52af3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.467475 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rjz7\" (UniqueName: \"kubernetes.io/projected/ebb751f5-9184-4f6f-8260-663248c52af3-kube-api-access-6rjz7\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.467717 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.467811 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb751f5-9184-4f6f-8260-663248c52af3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.994705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhfsz" 
event={"ID":"ebb751f5-9184-4f6f-8260-663248c52af3","Type":"ContainerDied","Data":"09c5be3ef7350fecb40e3f7e499beac8e6e0bb30522b9edd13e18568239af078"} Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.994760 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhfsz" Mar 20 15:16:14 crc kubenswrapper[4764]: I0320 15:16:14.994921 4764 scope.go:117] "RemoveContainer" containerID="19fcb6c00659629ba95fc38c65aba47d9a992f6d9e18653a1c93299a50ac32f8" Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.006928 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" event={"ID":"2bc8745c-8628-48f5-9562-1de3f0c30286","Type":"ContainerStarted","Data":"99f7d3ea5df0108dc1bc6de1a42100e95d1dcf96f9060ee012138bc96b51b2d6"} Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.026574 4764 scope.go:117] "RemoveContainer" containerID="b9c4ae3ca846ce2f299fec62e6e1f36b8f93dbe87fe24fff26098cf637c530d3" Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.038251 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" podStartSLOduration=2.379029555 podStartE2EDuration="11.038214779s" podCreationTimestamp="2026-03-20 15:16:04 +0000 UTC" firstStartedPulling="2026-03-20 15:16:05.382303083 +0000 UTC m=+1486.998492212" lastFinishedPulling="2026-03-20 15:16:14.041488307 +0000 UTC m=+1495.657677436" observedRunningTime="2026-03-20 15:16:15.024320976 +0000 UTC m=+1496.640510115" watchObservedRunningTime="2026-03-20 15:16:15.038214779 +0000 UTC m=+1496.654403908" Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.051522 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.066075 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-zhfsz"] Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.069473 4764 scope.go:117] "RemoveContainer" containerID="420c91b0a587d0ff0d77cc651672cbcedba9bbe9ad31483d707b960fc346affd" Mar 20 15:16:15 crc kubenswrapper[4764]: I0320 15:16:15.139516 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" path="/var/lib/kubelet/pods/ebb751f5-9184-4f6f-8260-663248c52af3/volumes" Mar 20 15:16:18 crc kubenswrapper[4764]: I0320 15:16:18.083751 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 15:16:18 crc kubenswrapper[4764]: I0320 15:16:18.170458 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jsp8s" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" probeResult="failure" output=< Mar 20 15:16:18 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:16:18 crc kubenswrapper[4764]: > Mar 20 15:16:19 crc kubenswrapper[4764]: I0320 15:16:19.373538 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 15:16:25 crc kubenswrapper[4764]: I0320 15:16:25.133643 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bc8745c-8628-48f5-9562-1de3f0c30286" containerID="99f7d3ea5df0108dc1bc6de1a42100e95d1dcf96f9060ee012138bc96b51b2d6" exitCode=0 Mar 20 15:16:25 crc kubenswrapper[4764]: I0320 15:16:25.138186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" event={"ID":"2bc8745c-8628-48f5-9562-1de3f0c30286","Type":"ContainerDied","Data":"99f7d3ea5df0108dc1bc6de1a42100e95d1dcf96f9060ee012138bc96b51b2d6"} Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.701568 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.839939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory\") pod \"2bc8745c-8628-48f5-9562-1de3f0c30286\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.840071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle\") pod \"2bc8745c-8628-48f5-9562-1de3f0c30286\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.840162 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h45fc\" (UniqueName: \"kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc\") pod \"2bc8745c-8628-48f5-9562-1de3f0c30286\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.840220 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam\") pod \"2bc8745c-8628-48f5-9562-1de3f0c30286\" (UID: \"2bc8745c-8628-48f5-9562-1de3f0c30286\") " Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.845881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc" (OuterVolumeSpecName: "kube-api-access-h45fc") pod "2bc8745c-8628-48f5-9562-1de3f0c30286" (UID: "2bc8745c-8628-48f5-9562-1de3f0c30286"). InnerVolumeSpecName "kube-api-access-h45fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.848950 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2bc8745c-8628-48f5-9562-1de3f0c30286" (UID: "2bc8745c-8628-48f5-9562-1de3f0c30286"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.886309 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory" (OuterVolumeSpecName: "inventory") pod "2bc8745c-8628-48f5-9562-1de3f0c30286" (UID: "2bc8745c-8628-48f5-9562-1de3f0c30286"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.893601 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2bc8745c-8628-48f5-9562-1de3f0c30286" (UID: "2bc8745c-8628-48f5-9562-1de3f0c30286"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.943425 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h45fc\" (UniqueName: \"kubernetes.io/projected/2bc8745c-8628-48f5-9562-1de3f0c30286-kube-api-access-h45fc\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.943473 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.943493 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:26 crc kubenswrapper[4764]: I0320 15:16:26.943511 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bc8745c-8628-48f5-9562-1de3f0c30286-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.162668 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" event={"ID":"2bc8745c-8628-48f5-9562-1de3f0c30286","Type":"ContainerDied","Data":"db36758245cd7b92eaec32f4acdc2475ef0dcbe81c67434c83ce791c6cb1f12f"} Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.162708 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db36758245cd7b92eaec32f4acdc2475ef0dcbe81c67434c83ce791c6cb1f12f" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.162738 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.187585 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.245498 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.284339 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5"] Mar 20 15:16:27 crc kubenswrapper[4764]: E0320 15:16:27.284892 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="extract-content" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.284917 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="extract-content" Mar 20 15:16:27 crc kubenswrapper[4764]: E0320 15:16:27.284939 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="registry-server" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.284948 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="registry-server" Mar 20 15:16:27 crc kubenswrapper[4764]: E0320 15:16:27.284993 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="extract-utilities" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.285002 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="extract-utilities" Mar 20 15:16:27 crc kubenswrapper[4764]: E0320 15:16:27.285024 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2bc8745c-8628-48f5-9562-1de3f0c30286" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.285034 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bc8745c-8628-48f5-9562-1de3f0c30286" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.285256 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb751f5-9184-4f6f-8260-663248c52af3" containerName="registry-server" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.285294 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bc8745c-8628-48f5-9562-1de3f0c30286" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.286126 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.289621 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.289769 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.289999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.290154 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.311703 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5"] Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.351827 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.351890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.351946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2bg\" (UniqueName: \"kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.436915 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.454085 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2bg\" (UniqueName: \"kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.454341 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.454429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.459805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.462298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.487852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2bg\" (UniqueName: \"kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7n5g5\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:27 crc kubenswrapper[4764]: I0320 15:16:27.606780 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:28 crc kubenswrapper[4764]: I0320 15:16:28.239815 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5"] Mar 20 15:16:28 crc kubenswrapper[4764]: W0320 15:16:28.242611 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09e73d7f_44ca_4b1f_bf4c_aa4793441e30.slice/crio-7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062 WatchSource:0}: Error finding container 7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062: Status 404 returned error can't find the container with id 7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062 Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.184347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" event={"ID":"09e73d7f-44ca-4b1f-bf4c-aa4793441e30","Type":"ContainerStarted","Data":"90b55de1a9ed89a44afec80b105acfc00c55e2ad7e50e13920d09b04ae611475"} Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.184786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" event={"ID":"09e73d7f-44ca-4b1f-bf4c-aa4793441e30","Type":"ContainerStarted","Data":"7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062"} Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.184899 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jsp8s" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" 
containerID="cri-o://de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79" gracePeriod=2 Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.204088 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" podStartSLOduration=1.6794148660000001 podStartE2EDuration="2.204045868s" podCreationTimestamp="2026-03-20 15:16:27 +0000 UTC" firstStartedPulling="2026-03-20 15:16:28.24649232 +0000 UTC m=+1509.862681489" lastFinishedPulling="2026-03-20 15:16:28.771123352 +0000 UTC m=+1510.387312491" observedRunningTime="2026-03-20 15:16:29.203182982 +0000 UTC m=+1510.819372141" watchObservedRunningTime="2026-03-20 15:16:29.204045868 +0000 UTC m=+1510.820235017" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.692655 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.797707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities\") pod \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.797768 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf56f\" (UniqueName: \"kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f\") pod \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\" (UID: \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.797792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content\") pod \"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\" (UID: 
\"cc4bf2b3-90f1-4faf-8e98-16004adc36ef\") " Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.799178 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities" (OuterVolumeSpecName: "utilities") pod "cc4bf2b3-90f1-4faf-8e98-16004adc36ef" (UID: "cc4bf2b3-90f1-4faf-8e98-16004adc36ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.804333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f" (OuterVolumeSpecName: "kube-api-access-gf56f") pod "cc4bf2b3-90f1-4faf-8e98-16004adc36ef" (UID: "cc4bf2b3-90f1-4faf-8e98-16004adc36ef"). InnerVolumeSpecName "kube-api-access-gf56f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.900972 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.901006 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf56f\" (UniqueName: \"kubernetes.io/projected/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-kube-api-access-gf56f\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:29 crc kubenswrapper[4764]: I0320 15:16:29.932404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc4bf2b3-90f1-4faf-8e98-16004adc36ef" (UID: "cc4bf2b3-90f1-4faf-8e98-16004adc36ef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.002527 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4bf2b3-90f1-4faf-8e98-16004adc36ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.199172 4764 generic.go:334] "Generic (PLEG): container finished" podID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerID="de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79" exitCode=0 Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.199291 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsp8s" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.199280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerDied","Data":"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79"} Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.199356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsp8s" event={"ID":"cc4bf2b3-90f1-4faf-8e98-16004adc36ef","Type":"ContainerDied","Data":"78f79add4ed7bcbe87e1f78abe611c2fbf60578668eb0123a78e9464ce311c90"} Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.199408 4764 scope.go:117] "RemoveContainer" containerID="de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.242543 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.253301 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jsp8s"] Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.254144 
4764 scope.go:117] "RemoveContainer" containerID="88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.278351 4764 scope.go:117] "RemoveContainer" containerID="2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.316339 4764 scope.go:117] "RemoveContainer" containerID="de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79" Mar 20 15:16:30 crc kubenswrapper[4764]: E0320 15:16:30.317023 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79\": container with ID starting with de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79 not found: ID does not exist" containerID="de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.317079 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79"} err="failed to get container status \"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79\": rpc error: code = NotFound desc = could not find container \"de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79\": container with ID starting with de0a67d65cfd8cd903be40fd562542f5e6c9f6b062f349f72f6bee17783b0c79 not found: ID does not exist" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.317112 4764 scope.go:117] "RemoveContainer" containerID="88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e" Mar 20 15:16:30 crc kubenswrapper[4764]: E0320 15:16:30.317427 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e\": container with ID starting 
with 88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e not found: ID does not exist" containerID="88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.317458 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e"} err="failed to get container status \"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e\": rpc error: code = NotFound desc = could not find container \"88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e\": container with ID starting with 88b6c22efa7fc0ccb5772b578bdfdd6546ddbcfe3a3be1b1a5f3ecb74641465e not found: ID does not exist" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.317480 4764 scope.go:117] "RemoveContainer" containerID="2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932" Mar 20 15:16:30 crc kubenswrapper[4764]: E0320 15:16:30.317721 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932\": container with ID starting with 2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932 not found: ID does not exist" containerID="2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932" Mar 20 15:16:30 crc kubenswrapper[4764]: I0320 15:16:30.317773 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932"} err="failed to get container status \"2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932\": rpc error: code = NotFound desc = could not find container \"2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932\": container with ID starting with 2a3be1b6376743ccc9f6f8e20640bcc7acd2c34568522e27ff73ff9b0f1c5932 not found: ID does 
not exist" Mar 20 15:16:31 crc kubenswrapper[4764]: I0320 15:16:31.138198 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" path="/var/lib/kubelet/pods/cc4bf2b3-90f1-4faf-8e98-16004adc36ef/volumes" Mar 20 15:16:32 crc kubenswrapper[4764]: I0320 15:16:32.216924 4764 generic.go:334] "Generic (PLEG): container finished" podID="09e73d7f-44ca-4b1f-bf4c-aa4793441e30" containerID="90b55de1a9ed89a44afec80b105acfc00c55e2ad7e50e13920d09b04ae611475" exitCode=0 Mar 20 15:16:32 crc kubenswrapper[4764]: I0320 15:16:32.217017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" event={"ID":"09e73d7f-44ca-4b1f-bf4c-aa4793441e30","Type":"ContainerDied","Data":"90b55de1a9ed89a44afec80b105acfc00c55e2ad7e50e13920d09b04ae611475"} Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.674304 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.786728 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory\") pod \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.786829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam\") pod \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.786872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2bg\" (UniqueName: 
\"kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg\") pod \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\" (UID: \"09e73d7f-44ca-4b1f-bf4c-aa4793441e30\") " Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.797695 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg" (OuterVolumeSpecName: "kube-api-access-xz2bg") pod "09e73d7f-44ca-4b1f-bf4c-aa4793441e30" (UID: "09e73d7f-44ca-4b1f-bf4c-aa4793441e30"). InnerVolumeSpecName "kube-api-access-xz2bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.823301 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory" (OuterVolumeSpecName: "inventory") pod "09e73d7f-44ca-4b1f-bf4c-aa4793441e30" (UID: "09e73d7f-44ca-4b1f-bf4c-aa4793441e30"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.836268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "09e73d7f-44ca-4b1f-bf4c-aa4793441e30" (UID: "09e73d7f-44ca-4b1f-bf4c-aa4793441e30"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.890035 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.890080 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2bg\" (UniqueName: \"kubernetes.io/projected/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-kube-api-access-xz2bg\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:33 crc kubenswrapper[4764]: I0320 15:16:33.890093 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09e73d7f-44ca-4b1f-bf4c-aa4793441e30-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.242921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" event={"ID":"09e73d7f-44ca-4b1f-bf4c-aa4793441e30","Type":"ContainerDied","Data":"7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062"} Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.242983 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7db0666a20f8d81523ac06c633a295bce263afeaa40af82f9626089e610b1062" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.243042 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7n5g5" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.308549 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4"] Mar 20 15:16:34 crc kubenswrapper[4764]: E0320 15:16:34.308960 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="extract-content" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.308980 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="extract-content" Mar 20 15:16:34 crc kubenswrapper[4764]: E0320 15:16:34.309017 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="extract-utilities" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.309027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="extract-utilities" Mar 20 15:16:34 crc kubenswrapper[4764]: E0320 15:16:34.309045 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e73d7f-44ca-4b1f-bf4c-aa4793441e30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.309055 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e73d7f-44ca-4b1f-bf4c-aa4793441e30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:34 crc kubenswrapper[4764]: E0320 15:16:34.309075 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.309083 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.309286 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4bf2b3-90f1-4faf-8e98-16004adc36ef" containerName="registry-server" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.309305 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e73d7f-44ca-4b1f-bf4c-aa4793441e30" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.310012 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.312736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.313025 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.313190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.313559 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.319851 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4"] Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.399297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: 
I0320 15:16:34.399426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.399485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtcxd\" (UniqueName: \"kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.399545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.500740 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.500904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.501020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtcxd\" (UniqueName: \"kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.501117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.506282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.506414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.507857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.532329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtcxd\" (UniqueName: \"kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:34 crc kubenswrapper[4764]: I0320 15:16:34.644084 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:16:35 crc kubenswrapper[4764]: I0320 15:16:35.233418 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4"] Mar 20 15:16:35 crc kubenswrapper[4764]: W0320 15:16:35.248728 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb572961_158b_4953_aaf1_af5b9e940592.slice/crio-553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4 WatchSource:0}: Error finding container 553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4: Status 404 returned error can't find the container with id 553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4 Mar 20 15:16:36 crc kubenswrapper[4764]: I0320 15:16:36.262320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" event={"ID":"db572961-158b-4953-aaf1-af5b9e940592","Type":"ContainerStarted","Data":"98547464d854e437176e290413799ccdca2da9a404394bdff10c13cf4dd89ca1"} Mar 20 15:16:36 crc kubenswrapper[4764]: I0320 15:16:36.263227 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" event={"ID":"db572961-158b-4953-aaf1-af5b9e940592","Type":"ContainerStarted","Data":"553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4"} Mar 20 15:16:36 crc kubenswrapper[4764]: I0320 15:16:36.282451 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" podStartSLOduration=1.680478736 podStartE2EDuration="2.282430945s" podCreationTimestamp="2026-03-20 15:16:34 +0000 UTC" firstStartedPulling="2026-03-20 15:16:35.25349539 +0000 UTC m=+1516.869684529" lastFinishedPulling="2026-03-20 15:16:35.855447569 +0000 UTC m=+1517.471636738" 
observedRunningTime="2026-03-20 15:16:36.278324079 +0000 UTC m=+1517.894513228" watchObservedRunningTime="2026-03-20 15:16:36.282430945 +0000 UTC m=+1517.898620074" Mar 20 15:16:38 crc kubenswrapper[4764]: I0320 15:16:38.443905 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:16:38 crc kubenswrapper[4764]: I0320 15:16:38.444410 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:17:06 crc kubenswrapper[4764]: I0320 15:17:06.604640 4764 scope.go:117] "RemoveContainer" containerID="e8310b6a04844394537da70e1c56f7dddc0b9212f6bf023da59cd0327362d61a" Mar 20 15:17:06 crc kubenswrapper[4764]: I0320 15:17:06.656726 4764 scope.go:117] "RemoveContainer" containerID="a39624500902aecf9b1f4fab12c4f21d0c0ba21c979c01022d316a34f7c2c00d" Mar 20 15:17:06 crc kubenswrapper[4764]: I0320 15:17:06.693507 4764 scope.go:117] "RemoveContainer" containerID="1c8381cc390c8c464e734160bcb307849ba72cceaf08235c770d7cfde7ce8aaf" Mar 20 15:17:06 crc kubenswrapper[4764]: I0320 15:17:06.741922 4764 scope.go:117] "RemoveContainer" containerID="f830207a2cea7607a6a7a4beb77d97480a3736866a04fabfe3c4c06b2d5ff701" Mar 20 15:17:08 crc kubenswrapper[4764]: I0320 15:17:08.443659 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:17:08 
crc kubenswrapper[4764]: I0320 15:17:08.444098 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:17:38 crc kubenswrapper[4764]: I0320 15:17:38.443434 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:17:38 crc kubenswrapper[4764]: I0320 15:17:38.443990 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:17:38 crc kubenswrapper[4764]: I0320 15:17:38.444044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:17:38 crc kubenswrapper[4764]: I0320 15:17:38.445216 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:17:38 crc kubenswrapper[4764]: I0320 15:17:38.445303 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" gracePeriod=600 Mar 20 15:17:38 crc kubenswrapper[4764]: E0320 15:17:38.670486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:17:39 crc kubenswrapper[4764]: I0320 15:17:39.242707 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" exitCode=0 Mar 20 15:17:39 crc kubenswrapper[4764]: I0320 15:17:39.242768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be"} Mar 20 15:17:39 crc kubenswrapper[4764]: I0320 15:17:39.243131 4764 scope.go:117] "RemoveContainer" containerID="12d0a96258b093aee4f40f6af8a6aca80a1ed347e605a2693dc0a396877cb9c2" Mar 20 15:17:39 crc kubenswrapper[4764]: I0320 15:17:39.243827 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:17:39 crc kubenswrapper[4764]: E0320 15:17:39.244148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:17:53 crc kubenswrapper[4764]: I0320 15:17:53.126908 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:17:53 crc kubenswrapper[4764]: E0320 15:17:53.127976 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.155544 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566998-669xr"] Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.158055 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.160436 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.165342 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.165359 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.193927 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566998-669xr"] Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.240765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2lc\" (UniqueName: \"kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc\") pod \"auto-csr-approver-29566998-669xr\" (UID: \"67727519-47f8-4167-a718-c140a71b2dae\") " pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.343281 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2lc\" (UniqueName: \"kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc\") pod \"auto-csr-approver-29566998-669xr\" (UID: \"67727519-47f8-4167-a718-c140a71b2dae\") " pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.361181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw2lc\" (UniqueName: \"kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc\") pod \"auto-csr-approver-29566998-669xr\" (UID: \"67727519-47f8-4167-a718-c140a71b2dae\") " 
pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.479050 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:00 crc kubenswrapper[4764]: I0320 15:18:00.969134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566998-669xr"] Mar 20 15:18:01 crc kubenswrapper[4764]: I0320 15:18:01.451563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566998-669xr" event={"ID":"67727519-47f8-4167-a718-c140a71b2dae","Type":"ContainerStarted","Data":"c3de0fc2bdb466437f0854375858a2462ab17c31b024f3bddc75b38b34f10f16"} Mar 20 15:18:02 crc kubenswrapper[4764]: I0320 15:18:02.463468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566998-669xr" event={"ID":"67727519-47f8-4167-a718-c140a71b2dae","Type":"ContainerStarted","Data":"c01781c56e21fce86204109ad8f7345f0237d0a20ea1aca7504d738fa2299089"} Mar 20 15:18:02 crc kubenswrapper[4764]: I0320 15:18:02.476474 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566998-669xr" podStartSLOduration=1.44388921 podStartE2EDuration="2.476453847s" podCreationTimestamp="2026-03-20 15:18:00 +0000 UTC" firstStartedPulling="2026-03-20 15:18:00.968107708 +0000 UTC m=+1602.584296837" lastFinishedPulling="2026-03-20 15:18:02.000672305 +0000 UTC m=+1603.616861474" observedRunningTime="2026-03-20 15:18:02.474200057 +0000 UTC m=+1604.090389186" watchObservedRunningTime="2026-03-20 15:18:02.476453847 +0000 UTC m=+1604.092642976" Mar 20 15:18:03 crc kubenswrapper[4764]: I0320 15:18:03.477728 4764 generic.go:334] "Generic (PLEG): container finished" podID="67727519-47f8-4167-a718-c140a71b2dae" containerID="c01781c56e21fce86204109ad8f7345f0237d0a20ea1aca7504d738fa2299089" exitCode=0 Mar 20 15:18:03 crc 
kubenswrapper[4764]: I0320 15:18:03.477779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566998-669xr" event={"ID":"67727519-47f8-4167-a718-c140a71b2dae","Type":"ContainerDied","Data":"c01781c56e21fce86204109ad8f7345f0237d0a20ea1aca7504d738fa2299089"} Mar 20 15:18:04 crc kubenswrapper[4764]: I0320 15:18:04.910419 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:04 crc kubenswrapper[4764]: I0320 15:18:04.944508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw2lc\" (UniqueName: \"kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc\") pod \"67727519-47f8-4167-a718-c140a71b2dae\" (UID: \"67727519-47f8-4167-a718-c140a71b2dae\") " Mar 20 15:18:04 crc kubenswrapper[4764]: I0320 15:18:04.951505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc" (OuterVolumeSpecName: "kube-api-access-pw2lc") pod "67727519-47f8-4167-a718-c140a71b2dae" (UID: "67727519-47f8-4167-a718-c140a71b2dae"). InnerVolumeSpecName "kube-api-access-pw2lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.047087 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw2lc\" (UniqueName: \"kubernetes.io/projected/67727519-47f8-4167-a718-c140a71b2dae-kube-api-access-pw2lc\") on node \"crc\" DevicePath \"\"" Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.498690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566998-669xr" event={"ID":"67727519-47f8-4167-a718-c140a71b2dae","Type":"ContainerDied","Data":"c3de0fc2bdb466437f0854375858a2462ab17c31b024f3bddc75b38b34f10f16"} Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.498743 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3de0fc2bdb466437f0854375858a2462ab17c31b024f3bddc75b38b34f10f16" Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.498771 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566998-669xr" Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.547706 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566992-6qwzt"] Mar 20 15:18:05 crc kubenswrapper[4764]: I0320 15:18:05.559766 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566992-6qwzt"] Mar 20 15:18:06 crc kubenswrapper[4764]: I0320 15:18:06.913779 4764 scope.go:117] "RemoveContainer" containerID="8dbbb55a76c2179a8117d688f9055882c91cbebaa24df85b6b20ed45136493e3" Mar 20 15:18:07 crc kubenswrapper[4764]: I0320 15:18:07.129872 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:18:07 crc kubenswrapper[4764]: E0320 15:18:07.130080 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:18:07 crc kubenswrapper[4764]: I0320 15:18:07.144679 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a667b00c-fedc-470d-adb4-309d0a96a676" path="/var/lib/kubelet/pods/a667b00c-fedc-470d-adb4-309d0a96a676/volumes" Mar 20 15:18:20 crc kubenswrapper[4764]: I0320 15:18:20.127864 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:18:20 crc kubenswrapper[4764]: E0320 15:18:20.128601 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:18:33 crc kubenswrapper[4764]: I0320 15:18:33.126928 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:18:33 crc kubenswrapper[4764]: E0320 15:18:33.127956 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:18:45 crc kubenswrapper[4764]: I0320 15:18:45.126987 4764 scope.go:117] "RemoveContainer" 
containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:18:45 crc kubenswrapper[4764]: E0320 15:18:45.128175 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:18:58 crc kubenswrapper[4764]: I0320 15:18:58.126774 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:18:58 crc kubenswrapper[4764]: E0320 15:18:58.127540 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:19:07 crc kubenswrapper[4764]: I0320 15:19:07.022435 4764 scope.go:117] "RemoveContainer" containerID="59d10a14895dddc75eda73f47e084a6ca150ba609f6efc06eb77693aefa513a8" Mar 20 15:19:07 crc kubenswrapper[4764]: I0320 15:19:07.075303 4764 scope.go:117] "RemoveContainer" containerID="d8e2b25ff1a56bf327c5296603ec2a96bc37d0185283e691c8da4656c7fa90f0" Mar 20 15:19:07 crc kubenswrapper[4764]: I0320 15:19:07.116129 4764 scope.go:117] "RemoveContainer" containerID="bb41983f78feb0ec34c6f2d9b09214cc19ae10c2f196f086cebb67423e5791b1" Mar 20 15:19:12 crc kubenswrapper[4764]: I0320 15:19:12.128282 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:19:12 crc 
kubenswrapper[4764]: E0320 15:19:12.128968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:19:23 crc kubenswrapper[4764]: I0320 15:19:23.127320 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:19:23 crc kubenswrapper[4764]: E0320 15:19:23.128319 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:19:36 crc kubenswrapper[4764]: I0320 15:19:36.126825 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:19:36 crc kubenswrapper[4764]: E0320 15:19:36.127990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:19:45 crc kubenswrapper[4764]: I0320 15:19:45.554053 4764 generic.go:334] "Generic (PLEG): container finished" podID="db572961-158b-4953-aaf1-af5b9e940592" 
containerID="98547464d854e437176e290413799ccdca2da9a404394bdff10c13cf4dd89ca1" exitCode=0 Mar 20 15:19:45 crc kubenswrapper[4764]: I0320 15:19:45.554130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" event={"ID":"db572961-158b-4953-aaf1-af5b9e940592","Type":"ContainerDied","Data":"98547464d854e437176e290413799ccdca2da9a404394bdff10c13cf4dd89ca1"} Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.005603 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.059074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle\") pod \"db572961-158b-4953-aaf1-af5b9e940592\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.059210 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam\") pod \"db572961-158b-4953-aaf1-af5b9e940592\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.059371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory\") pod \"db572961-158b-4953-aaf1-af5b9e940592\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.059641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtcxd\" (UniqueName: 
\"kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd\") pod \"db572961-158b-4953-aaf1-af5b9e940592\" (UID: \"db572961-158b-4953-aaf1-af5b9e940592\") " Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.071207 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "db572961-158b-4953-aaf1-af5b9e940592" (UID: "db572961-158b-4953-aaf1-af5b9e940592"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.071350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd" (OuterVolumeSpecName: "kube-api-access-rtcxd") pod "db572961-158b-4953-aaf1-af5b9e940592" (UID: "db572961-158b-4953-aaf1-af5b9e940592"). InnerVolumeSpecName "kube-api-access-rtcxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.084529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory" (OuterVolumeSpecName: "inventory") pod "db572961-158b-4953-aaf1-af5b9e940592" (UID: "db572961-158b-4953-aaf1-af5b9e940592"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.105821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "db572961-158b-4953-aaf1-af5b9e940592" (UID: "db572961-158b-4953-aaf1-af5b9e940592"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.163412 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.163454 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.163466 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db572961-158b-4953-aaf1-af5b9e940592-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.163475 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtcxd\" (UniqueName: \"kubernetes.io/projected/db572961-158b-4953-aaf1-af5b9e940592-kube-api-access-rtcxd\") on node \"crc\" DevicePath \"\"" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.589856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" event={"ID":"db572961-158b-4953-aaf1-af5b9e940592","Type":"ContainerDied","Data":"553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4"} Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.590434 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="553f5833ead6d73ff813d74b565370b52eaf108fc6903cc2b8ce430085634cf4" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.589968 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.688984 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm"] Mar 20 15:19:47 crc kubenswrapper[4764]: E0320 15:19:47.689683 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67727519-47f8-4167-a718-c140a71b2dae" containerName="oc" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.689718 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67727519-47f8-4167-a718-c140a71b2dae" containerName="oc" Mar 20 15:19:47 crc kubenswrapper[4764]: E0320 15:19:47.689760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db572961-158b-4953-aaf1-af5b9e940592" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.689774 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="db572961-158b-4953-aaf1-af5b9e940592" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.690090 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="67727519-47f8-4167-a718-c140a71b2dae" containerName="oc" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.690125 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="db572961-158b-4953-aaf1-af5b9e940592" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.691219 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.694483 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.694683 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.694803 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.694971 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.720780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm"] Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.775676 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ldv\" (UniqueName: \"kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.775936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc 
kubenswrapper[4764]: I0320 15:19:47.776096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.877815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.878001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ldv\" (UniqueName: \"kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.878033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.882331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.883084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:47 crc kubenswrapper[4764]: I0320 15:19:47.899256 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ldv\" (UniqueName: \"kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:48 crc kubenswrapper[4764]: I0320 15:19:48.021401 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:19:48 crc kubenswrapper[4764]: I0320 15:19:48.126229 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:19:48 crc kubenswrapper[4764]: E0320 15:19:48.126558 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:19:48 crc kubenswrapper[4764]: I0320 15:19:48.616828 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm"] Mar 20 15:19:49 crc kubenswrapper[4764]: I0320 15:19:49.611287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" event={"ID":"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c","Type":"ContainerStarted","Data":"174c16b8e73298820eb413bc9c7a432018b4c95aa36790ad8e2be148a0463543"} Mar 20 15:19:50 crc kubenswrapper[4764]: I0320 15:19:50.625051 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" event={"ID":"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c","Type":"ContainerStarted","Data":"2f61cfbed5baefd2a2a9df9ee588f7666fc5d46d725903304d650edfab7bf00f"} Mar 20 15:19:50 crc kubenswrapper[4764]: I0320 15:19:50.654776 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" podStartSLOduration=2.850534825 podStartE2EDuration="3.654741884s" podCreationTimestamp="2026-03-20 15:19:47 +0000 UTC" 
firstStartedPulling="2026-03-20 15:19:48.620491902 +0000 UTC m=+1710.236681041" lastFinishedPulling="2026-03-20 15:19:49.424698971 +0000 UTC m=+1711.040888100" observedRunningTime="2026-03-20 15:19:50.644461651 +0000 UTC m=+1712.260650820" watchObservedRunningTime="2026-03-20 15:19:50.654741884 +0000 UTC m=+1712.270931053" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.144304 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567000-r6rgv"] Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.146020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.150680 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.150957 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.151115 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.155685 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567000-r6rgv"] Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.238144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xvdm\" (UniqueName: \"kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm\") pod \"auto-csr-approver-29567000-r6rgv\" (UID: \"b9668ba6-0faf-40eb-a0a2-9d0557167dda\") " pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.341589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xvdm\" 
(UniqueName: \"kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm\") pod \"auto-csr-approver-29567000-r6rgv\" (UID: \"b9668ba6-0faf-40eb-a0a2-9d0557167dda\") " pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.364206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xvdm\" (UniqueName: \"kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm\") pod \"auto-csr-approver-29567000-r6rgv\" (UID: \"b9668ba6-0faf-40eb-a0a2-9d0557167dda\") " pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.467854 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:00 crc kubenswrapper[4764]: I0320 15:20:00.916583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567000-r6rgv"] Mar 20 15:20:00 crc kubenswrapper[4764]: W0320 15:20:00.926725 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9668ba6_0faf_40eb_a0a2_9d0557167dda.slice/crio-9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb WatchSource:0}: Error finding container 9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb: Status 404 returned error can't find the container with id 9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb Mar 20 15:20:01 crc kubenswrapper[4764]: I0320 15:20:01.128292 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:20:01 crc kubenswrapper[4764]: E0320 15:20:01.128932 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:20:01 crc kubenswrapper[4764]: I0320 15:20:01.745465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" event={"ID":"b9668ba6-0faf-40eb-a0a2-9d0557167dda","Type":"ContainerStarted","Data":"9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb"} Mar 20 15:20:02 crc kubenswrapper[4764]: I0320 15:20:02.761320 4764 generic.go:334] "Generic (PLEG): container finished" podID="b9668ba6-0faf-40eb-a0a2-9d0557167dda" containerID="94cd599e827ac5d634a1c4690843f72536f703ae1d9e1e0adaeb1969834c5aaa" exitCode=0 Mar 20 15:20:02 crc kubenswrapper[4764]: I0320 15:20:02.761526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" event={"ID":"b9668ba6-0faf-40eb-a0a2-9d0557167dda","Type":"ContainerDied","Data":"94cd599e827ac5d634a1c4690843f72536f703ae1d9e1e0adaeb1969834c5aaa"} Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.229991 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.336566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xvdm\" (UniqueName: \"kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm\") pod \"b9668ba6-0faf-40eb-a0a2-9d0557167dda\" (UID: \"b9668ba6-0faf-40eb-a0a2-9d0557167dda\") " Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.349887 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm" (OuterVolumeSpecName: "kube-api-access-8xvdm") pod "b9668ba6-0faf-40eb-a0a2-9d0557167dda" (UID: "b9668ba6-0faf-40eb-a0a2-9d0557167dda"). InnerVolumeSpecName "kube-api-access-8xvdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.438913 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xvdm\" (UniqueName: \"kubernetes.io/projected/b9668ba6-0faf-40eb-a0a2-9d0557167dda-kube-api-access-8xvdm\") on node \"crc\" DevicePath \"\"" Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.784125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" event={"ID":"b9668ba6-0faf-40eb-a0a2-9d0557167dda","Type":"ContainerDied","Data":"9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb"} Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.784466 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9517f2653576591d34f5cae2c29df8eb811427f0b6c4722cfd27fef22dfc8dbb" Mar 20 15:20:04 crc kubenswrapper[4764]: I0320 15:20:04.784177 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567000-r6rgv" Mar 20 15:20:05 crc kubenswrapper[4764]: I0320 15:20:05.314045 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566994-9t58k"] Mar 20 15:20:05 crc kubenswrapper[4764]: I0320 15:20:05.324110 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566994-9t58k"] Mar 20 15:20:07 crc kubenswrapper[4764]: I0320 15:20:07.136999 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b25e8b0-ce3b-48ff-81df-4589b0ec17ea" path="/var/lib/kubelet/pods/0b25e8b0-ce3b-48ff-81df-4589b0ec17ea/volumes" Mar 20 15:20:07 crc kubenswrapper[4764]: I0320 15:20:07.232366 4764 scope.go:117] "RemoveContainer" containerID="5ca67c7488940d22b3e29c7c1ec2d664c3528065a3fbb57b032a20df5efa9e58" Mar 20 15:20:07 crc kubenswrapper[4764]: I0320 15:20:07.264877 4764 scope.go:117] "RemoveContainer" containerID="12add6006235e427d2ecedb6615911851169993dbc5e086bcd33a587843c6ce5" Mar 20 15:20:07 crc kubenswrapper[4764]: I0320 15:20:07.308354 4764 scope.go:117] "RemoveContainer" containerID="26ed4d66726c21a059cb2bce0b8e44bd4110fcd78ec96d1d90e14cb266212972" Mar 20 15:20:14 crc kubenswrapper[4764]: I0320 15:20:14.126305 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:20:14 crc kubenswrapper[4764]: E0320 15:20:14.127267 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:20:29 crc kubenswrapper[4764]: I0320 15:20:29.138588 4764 scope.go:117] "RemoveContainer" 
containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:20:29 crc kubenswrapper[4764]: E0320 15:20:29.139982 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.036108 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-74vjq"] Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.044346 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-44ab-account-create-update-mkdlb"] Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.052339 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-74vjq"] Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.059999 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-44ab-account-create-update-mkdlb"] Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.135145 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe" path="/var/lib/kubelet/pods/178a6de3-2ce6-43d1-951b-e7b3dc5c4cbe/volumes" Mar 20 15:20:41 crc kubenswrapper[4764]: I0320 15:20:41.135751 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2" path="/var/lib/kubelet/pods/f9ac66a3-3f0d-40c9-94d4-01d3c8d91de2/volumes" Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.028167 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7pbv8"] Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.038529 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/placement-c907-account-create-update-flnnn"] Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.046692 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7pbv8"] Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.055794 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c907-account-create-update-flnnn"] Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.138177 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec8fd1c-0dd8-48d3-8194-5d74198c652e" path="/var/lib/kubelet/pods/aec8fd1c-0dd8-48d3-8194-5d74198c652e/volumes" Mar 20 15:20:43 crc kubenswrapper[4764]: I0320 15:20:43.139449 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4206356-ff23-4e94-b1bb-d27749ca895d" path="/var/lib/kubelet/pods/f4206356-ff23-4e94-b1bb-d27749ca895d/volumes" Mar 20 15:20:44 crc kubenswrapper[4764]: I0320 15:20:44.049419 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e671-account-create-update-np4kd"] Mar 20 15:20:44 crc kubenswrapper[4764]: I0320 15:20:44.069442 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-tnk24"] Mar 20 15:20:44 crc kubenswrapper[4764]: I0320 15:20:44.080315 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e671-account-create-update-np4kd"] Mar 20 15:20:44 crc kubenswrapper[4764]: I0320 15:20:44.090870 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-tnk24"] Mar 20 15:20:44 crc kubenswrapper[4764]: I0320 15:20:44.126679 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:20:44 crc kubenswrapper[4764]: E0320 15:20:44.127113 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:20:45 crc kubenswrapper[4764]: I0320 15:20:45.145050 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb0f4b5-4acb-421d-9df5-ed34eed17848" path="/var/lib/kubelet/pods/7fb0f4b5-4acb-421d-9df5-ed34eed17848/volumes" Mar 20 15:20:45 crc kubenswrapper[4764]: I0320 15:20:45.146599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d636f2ec-b9c4-4d44-be52-c1c6a570bf4c" path="/var/lib/kubelet/pods/d636f2ec-b9c4-4d44-be52-c1c6a570bf4c/volumes" Mar 20 15:20:46 crc kubenswrapper[4764]: I0320 15:20:46.035729 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9drjm"] Mar 20 15:20:46 crc kubenswrapper[4764]: I0320 15:20:46.051560 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9drjm"] Mar 20 15:20:47 crc kubenswrapper[4764]: I0320 15:20:47.139884 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e2b458-a224-4332-8751-c45f85743cb7" path="/var/lib/kubelet/pods/87e2b458-a224-4332-8751-c45f85743cb7/volumes" Mar 20 15:20:59 crc kubenswrapper[4764]: I0320 15:20:59.134112 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:20:59 crc kubenswrapper[4764]: E0320 15:20:59.135185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.388064 4764 scope.go:117] "RemoveContainer" containerID="b7c01949f13ac6f2adc1c5d1353984bc20bf37f57280f04d6cecee64bc4ff0fb" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.439888 4764 scope.go:117] "RemoveContainer" containerID="507773a7464ac1ca2752ad7b1657c72617f1df223657774bcac59a0229b47717" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.492177 4764 scope.go:117] "RemoveContainer" containerID="a225621c76af64bcf5e5bb68f7f64c06c3b503f05535f0b2ef9e6f84c5f2b116" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.552587 4764 scope.go:117] "RemoveContainer" containerID="87866399c513616618074196708c942f4b19b7989a488296c5ac172d763c9e02" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.594756 4764 scope.go:117] "RemoveContainer" containerID="0bae5de10238a7beb8f0a7f3802a329a4df5969222160262a576adb89cbbfcdf" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.630130 4764 scope.go:117] "RemoveContainer" containerID="9248544ce2256e2693b971c6047c97c5fc06d849b130ad0e5fc7083448a0d67c" Mar 20 15:21:07 crc kubenswrapper[4764]: I0320 15:21:07.668753 4764 scope.go:117] "RemoveContainer" containerID="b00c8d0ce5992a941a7b4a0bb4646136131f15c9829801c3d4f7d8d5a66a62d6" Mar 20 15:21:08 crc kubenswrapper[4764]: I0320 15:21:08.038107 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-329f-account-create-update-qxhhn"] Mar 20 15:21:08 crc kubenswrapper[4764]: I0320 15:21:08.048872 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-329f-account-create-update-qxhhn"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.032962 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2tzqs"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.049391 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2tzqs"] Mar 20 15:21:09 crc 
kubenswrapper[4764]: I0320 15:21:09.057882 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0631-account-create-update-7kmh8"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.067143 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-88d7-account-create-update-p5wpr"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.076142 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-v78qk"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.086012 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7fdgf"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.095841 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-88d7-account-create-update-p5wpr"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.106144 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v78qk"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.116439 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7fdgf"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.125820 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0631-account-create-update-7kmh8"] Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.138202 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f42236b-4110-466f-8ab4-6ebaafd5e570" path="/var/lib/kubelet/pods/0f42236b-4110-466f-8ab4-6ebaafd5e570/volumes" Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.139030 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d44ec84-1f1d-4477-8441-25159cc06b9e" path="/var/lib/kubelet/pods/2d44ec84-1f1d-4477-8441-25159cc06b9e/volumes" Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.139831 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a827a7e-f360-4a04-9e24-0405b61b9501" 
path="/var/lib/kubelet/pods/7a827a7e-f360-4a04-9e24-0405b61b9501/volumes" Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.140539 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842127c9-b79d-4787-aa8f-8717e266f790" path="/var/lib/kubelet/pods/842127c9-b79d-4787-aa8f-8717e266f790/volumes" Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.141979 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90e7d64-0e27-4264-af29-d75b18ab3156" path="/var/lib/kubelet/pods/a90e7d64-0e27-4264-af29-d75b18ab3156/volumes" Mar 20 15:21:09 crc kubenswrapper[4764]: I0320 15:21:09.142685 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe206186-2715-4930-abb1-5917419fd021" path="/var/lib/kubelet/pods/fe206186-2715-4930-abb1-5917419fd021/volumes" Mar 20 15:21:12 crc kubenswrapper[4764]: I0320 15:21:12.127064 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:21:12 crc kubenswrapper[4764]: E0320 15:21:12.129610 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:21:15 crc kubenswrapper[4764]: I0320 15:21:15.043616 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wmzgg"] Mar 20 15:21:15 crc kubenswrapper[4764]: I0320 15:21:15.055887 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wmzgg"] Mar 20 15:21:15 crc kubenswrapper[4764]: I0320 15:21:15.138194 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b825928a-6583-4399-85ba-559a5f3081a0" 
path="/var/lib/kubelet/pods/b825928a-6583-4399-85ba-559a5f3081a0/volumes" Mar 20 15:21:19 crc kubenswrapper[4764]: I0320 15:21:19.551022 4764 generic.go:334] "Generic (PLEG): container finished" podID="297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" containerID="2f61cfbed5baefd2a2a9df9ee588f7666fc5d46d725903304d650edfab7bf00f" exitCode=0 Mar 20 15:21:19 crc kubenswrapper[4764]: I0320 15:21:19.551103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" event={"ID":"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c","Type":"ContainerDied","Data":"2f61cfbed5baefd2a2a9df9ee588f7666fc5d46d725903304d650edfab7bf00f"} Mar 20 15:21:20 crc kubenswrapper[4764]: I0320 15:21:20.978238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.053558 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory\") pod \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.053657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam\") pod \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.053769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ldv\" (UniqueName: \"kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv\") pod \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\" (UID: \"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c\") " Mar 20 15:21:21 crc 
kubenswrapper[4764]: I0320 15:21:21.059479 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv" (OuterVolumeSpecName: "kube-api-access-48ldv") pod "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" (UID: "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c"). InnerVolumeSpecName "kube-api-access-48ldv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.085973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" (UID: "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.086404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory" (OuterVolumeSpecName: "inventory") pod "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" (UID: "297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.155414 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.155439 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.155452 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ldv\" (UniqueName: \"kubernetes.io/projected/297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c-kube-api-access-48ldv\") on node \"crc\" DevicePath \"\"" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.571541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" event={"ID":"297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c","Type":"ContainerDied","Data":"174c16b8e73298820eb413bc9c7a432018b4c95aa36790ad8e2be148a0463543"} Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.571604 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="174c16b8e73298820eb413bc9c7a432018b4c95aa36790ad8e2be148a0463543" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.571664 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.659987 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw"] Mar 20 15:21:21 crc kubenswrapper[4764]: E0320 15:21:21.660430 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9668ba6-0faf-40eb-a0a2-9d0557167dda" containerName="oc" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.660452 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9668ba6-0faf-40eb-a0a2-9d0557167dda" containerName="oc" Mar 20 15:21:21 crc kubenswrapper[4764]: E0320 15:21:21.660484 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.660494 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.660694 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.660713 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9668ba6-0faf-40eb-a0a2-9d0557167dda" containerName="oc" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.661451 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.664691 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.664690 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.665215 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.666212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.675888 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw"] Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.768734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26d7j\" (UniqueName: \"kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.768893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: 
I0320 15:21:21.768928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.870169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.870707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.870802 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26d7j\" (UniqueName: \"kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.875896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.876263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.889072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26d7j\" (UniqueName: \"kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:21 crc kubenswrapper[4764]: I0320 15:21:21.982049 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:21:22 crc kubenswrapper[4764]: I0320 15:21:22.552709 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw"] Mar 20 15:21:22 crc kubenswrapper[4764]: I0320 15:21:22.559923 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:21:22 crc kubenswrapper[4764]: I0320 15:21:22.581315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" event={"ID":"95634505-7484-4887-973b-a91a632c48d1","Type":"ContainerStarted","Data":"800431beb29408080abd4d3fa6ed70fd0fc4289dbe177d5cfe318033946c0ec6"} Mar 20 15:21:23 crc kubenswrapper[4764]: I0320 15:21:23.597480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" event={"ID":"95634505-7484-4887-973b-a91a632c48d1","Type":"ContainerStarted","Data":"919d96ffb99636d81f3fa81effb19d99d6d7ed82886911ea06f920076704f35b"} Mar 20 15:21:23 crc kubenswrapper[4764]: I0320 15:21:23.616426 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" podStartSLOduration=1.892186468 podStartE2EDuration="2.616409323s" podCreationTimestamp="2026-03-20 15:21:21 +0000 UTC" firstStartedPulling="2026-03-20 15:21:22.559618524 +0000 UTC m=+1804.175807653" lastFinishedPulling="2026-03-20 15:21:23.283841359 +0000 UTC m=+1804.900030508" observedRunningTime="2026-03-20 15:21:23.614405589 +0000 UTC m=+1805.230594718" watchObservedRunningTime="2026-03-20 15:21:23.616409323 +0000 UTC m=+1805.232598452" Mar 20 15:21:25 crc kubenswrapper[4764]: I0320 15:21:25.126721 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 
15:21:25 crc kubenswrapper[4764]: E0320 15:21:25.127311 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:21:40 crc kubenswrapper[4764]: I0320 15:21:40.126354 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:21:40 crc kubenswrapper[4764]: E0320 15:21:40.127543 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:21:45 crc kubenswrapper[4764]: I0320 15:21:45.044763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-klrf6"] Mar 20 15:21:45 crc kubenswrapper[4764]: I0320 15:21:45.059785 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-klrf6"] Mar 20 15:21:45 crc kubenswrapper[4764]: I0320 15:21:45.146335 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adc582c0-f416-4991-89c7-9ddb850c0f2b" path="/var/lib/kubelet/pods/adc582c0-f416-4991-89c7-9ddb850c0f2b/volumes" Mar 20 15:21:47 crc kubenswrapper[4764]: I0320 15:21:47.036257 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p5v7z"] Mar 20 15:21:47 crc kubenswrapper[4764]: I0320 15:21:47.042398 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-p5v7z"] Mar 20 15:21:47 crc kubenswrapper[4764]: I0320 15:21:47.136746 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3ec055-05fb-42c0-bb97-342be3f1e32d" path="/var/lib/kubelet/pods/8d3ec055-05fb-42c0-bb97-342be3f1e32d/volumes" Mar 20 15:21:53 crc kubenswrapper[4764]: I0320 15:21:53.126031 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:21:53 crc kubenswrapper[4764]: E0320 15:21:53.126576 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.036547 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-q4nl4"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.051865 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xhldv"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.063785 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-q4nl4"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.072329 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8mz56"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.079877 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xhldv"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.089667 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8mz56"] Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.139534 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329bd08e-9bf1-4c6e-b234-e99022daa848" path="/var/lib/kubelet/pods/329bd08e-9bf1-4c6e-b234-e99022daa848/volumes" Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.140996 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8a0840-2a17-42d6-94e5-19653a16ff80" path="/var/lib/kubelet/pods/7a8a0840-2a17-42d6-94e5-19653a16ff80/volumes" Mar 20 15:21:55 crc kubenswrapper[4764]: I0320 15:21:55.142908 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df97d5e8-2808-4bef-9fad-b54c27554d23" path="/var/lib/kubelet/pods/df97d5e8-2808-4bef-9fad-b54c27554d23/volumes" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.147520 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567002-nvm8f"] Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.149021 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.150978 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.153435 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.153653 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.157599 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567002-nvm8f"] Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.211135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgbt\" (UniqueName: 
\"kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt\") pod \"auto-csr-approver-29567002-nvm8f\" (UID: \"46f06100-41b2-4f66-8e59-3ccb3cd7c38b\") " pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.313725 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgbt\" (UniqueName: \"kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt\") pod \"auto-csr-approver-29567002-nvm8f\" (UID: \"46f06100-41b2-4f66-8e59-3ccb3cd7c38b\") " pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.334932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgbt\" (UniqueName: \"kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt\") pod \"auto-csr-approver-29567002-nvm8f\" (UID: \"46f06100-41b2-4f66-8e59-3ccb3cd7c38b\") " pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.472276 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.932268 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567002-nvm8f"] Mar 20 15:22:00 crc kubenswrapper[4764]: I0320 15:22:00.958280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" event={"ID":"46f06100-41b2-4f66-8e59-3ccb3cd7c38b","Type":"ContainerStarted","Data":"082fd0e985aec8c1441f36e8dca02b48c26fe9a22a27edde4a49be4beb9d3a9e"} Mar 20 15:22:02 crc kubenswrapper[4764]: I0320 15:22:02.980260 4764 generic.go:334] "Generic (PLEG): container finished" podID="46f06100-41b2-4f66-8e59-3ccb3cd7c38b" containerID="3e3d42a74df62b1b836aacb23569107418c862474e9e8d6d8a31123f17e0de5a" exitCode=0 Mar 20 15:22:02 crc kubenswrapper[4764]: I0320 15:22:02.980705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" event={"ID":"46f06100-41b2-4f66-8e59-3ccb3cd7c38b","Type":"ContainerDied","Data":"3e3d42a74df62b1b836aacb23569107418c862474e9e8d6d8a31123f17e0de5a"} Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.335124 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.411043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgbt\" (UniqueName: \"kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt\") pod \"46f06100-41b2-4f66-8e59-3ccb3cd7c38b\" (UID: \"46f06100-41b2-4f66-8e59-3ccb3cd7c38b\") " Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.417072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt" (OuterVolumeSpecName: "kube-api-access-jhgbt") pod "46f06100-41b2-4f66-8e59-3ccb3cd7c38b" (UID: "46f06100-41b2-4f66-8e59-3ccb3cd7c38b"). InnerVolumeSpecName "kube-api-access-jhgbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.513673 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgbt\" (UniqueName: \"kubernetes.io/projected/46f06100-41b2-4f66-8e59-3ccb3cd7c38b-kube-api-access-jhgbt\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.998294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" event={"ID":"46f06100-41b2-4f66-8e59-3ccb3cd7c38b","Type":"ContainerDied","Data":"082fd0e985aec8c1441f36e8dca02b48c26fe9a22a27edde4a49be4beb9d3a9e"} Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.998350 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="082fd0e985aec8c1441f36e8dca02b48c26fe9a22a27edde4a49be4beb9d3a9e" Mar 20 15:22:04 crc kubenswrapper[4764]: I0320 15:22:04.998445 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567002-nvm8f" Mar 20 15:22:05 crc kubenswrapper[4764]: I0320 15:22:05.403709 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566996-cz9h9"] Mar 20 15:22:05 crc kubenswrapper[4764]: I0320 15:22:05.411793 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566996-cz9h9"] Mar 20 15:22:06 crc kubenswrapper[4764]: I0320 15:22:06.127631 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:22:06 crc kubenswrapper[4764]: E0320 15:22:06.127912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:22:07 crc kubenswrapper[4764]: I0320 15:22:07.138553 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d3f786-2402-4bee-ba83-c97c9c8b2209" path="/var/lib/kubelet/pods/08d3f786-2402-4bee-ba83-c97c9c8b2209/volumes" Mar 20 15:22:07 crc kubenswrapper[4764]: I0320 15:22:07.822915 4764 scope.go:117] "RemoveContainer" containerID="936b27acc46d1ccbb24af8fbf737a64c6218c1631d1a86cdb9edb1a0a9c2e08e" Mar 20 15:22:07 crc kubenswrapper[4764]: I0320 15:22:07.874128 4764 scope.go:117] "RemoveContainer" containerID="77ddb731624b68f78077df83ad7fc5d98c9260a4f1def31d45218fa7894f5400" Mar 20 15:22:07 crc kubenswrapper[4764]: I0320 15:22:07.932813 4764 scope.go:117] "RemoveContainer" containerID="22ec6044cdf531fba5b90673244c98c7b116936f8515281f0071eec8049aba68" Mar 20 15:22:07 crc kubenswrapper[4764]: I0320 15:22:07.982592 4764 scope.go:117] "RemoveContainer" 
containerID="5da5d2fb8d72b9c9b4b323b0b0b58b216e7e9aefbfeb0d6b2465effdb30193d4" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.051144 4764 scope.go:117] "RemoveContainer" containerID="107825f0c41ee53148d8ac0a49582e7792b029b37b9e4d8765d2dd015dd2650e" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.076469 4764 scope.go:117] "RemoveContainer" containerID="d35f6e46cb888f183b770e4a945c501f645c5991ed370bf8765dc8619fb449b0" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.118075 4764 scope.go:117] "RemoveContainer" containerID="1483c8585e5e4572c9912caabc2e2629a26afc72bb3fbedb1d51877c7290733d" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.140009 4764 scope.go:117] "RemoveContainer" containerID="1606e8ccc1db1c2c734072f286bcbebadc743142ff9a6e45d65e106544241caf" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.182452 4764 scope.go:117] "RemoveContainer" containerID="ad2e1c786cbc1c21a38880b3bb055dda6c21dfc3c98d0181be6a48931b287ccf" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.228558 4764 scope.go:117] "RemoveContainer" containerID="72df290a11c89fd7dd7e7193ad210c40d5174fa09d19a9e166161f55e97e46ce" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.250684 4764 scope.go:117] "RemoveContainer" containerID="e5563bf00571bb7280679430090642c4974f87cc237824756668b0d90105aacb" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.284285 4764 scope.go:117] "RemoveContainer" containerID="8eff8c0ac05e54673fbf6c7e2b62165935fca4bb2a85b981663fe393db72cd8f" Mar 20 15:22:08 crc kubenswrapper[4764]: I0320 15:22:08.326042 4764 scope.go:117] "RemoveContainer" containerID="ed9ab4db0648fec83e64bc230bb94ffa068fcdcaea68daef1abd99f722599c2d" Mar 20 15:22:14 crc kubenswrapper[4764]: I0320 15:22:14.051005 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6zj6m"] Mar 20 15:22:14 crc kubenswrapper[4764]: I0320 15:22:14.064064 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-6zj6m"] Mar 20 15:22:15 crc kubenswrapper[4764]: I0320 15:22:15.135907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337e2278-00e7-428e-97c1-c8d940d83aa4" path="/var/lib/kubelet/pods/337e2278-00e7-428e-97c1-c8d940d83aa4/volumes" Mar 20 15:22:21 crc kubenswrapper[4764]: I0320 15:22:21.127087 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:22:21 crc kubenswrapper[4764]: E0320 15:22:21.127952 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:22:31 crc kubenswrapper[4764]: I0320 15:22:31.265060 4764 generic.go:334] "Generic (PLEG): container finished" podID="95634505-7484-4887-973b-a91a632c48d1" containerID="919d96ffb99636d81f3fa81effb19d99d6d7ed82886911ea06f920076704f35b" exitCode=0 Mar 20 15:22:31 crc kubenswrapper[4764]: I0320 15:22:31.265113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" event={"ID":"95634505-7484-4887-973b-a91a632c48d1","Type":"ContainerDied","Data":"919d96ffb99636d81f3fa81effb19d99d6d7ed82886911ea06f920076704f35b"} Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.701073 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.815199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam\") pod \"95634505-7484-4887-973b-a91a632c48d1\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.815362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26d7j\" (UniqueName: \"kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j\") pod \"95634505-7484-4887-973b-a91a632c48d1\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.815401 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory\") pod \"95634505-7484-4887-973b-a91a632c48d1\" (UID: \"95634505-7484-4887-973b-a91a632c48d1\") " Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.820315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j" (OuterVolumeSpecName: "kube-api-access-26d7j") pod "95634505-7484-4887-973b-a91a632c48d1" (UID: "95634505-7484-4887-973b-a91a632c48d1"). InnerVolumeSpecName "kube-api-access-26d7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.846427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "95634505-7484-4887-973b-a91a632c48d1" (UID: "95634505-7484-4887-973b-a91a632c48d1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.859828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory" (OuterVolumeSpecName: "inventory") pod "95634505-7484-4887-973b-a91a632c48d1" (UID: "95634505-7484-4887-973b-a91a632c48d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.917119 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.917153 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26d7j\" (UniqueName: \"kubernetes.io/projected/95634505-7484-4887-973b-a91a632c48d1-kube-api-access-26d7j\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:32 crc kubenswrapper[4764]: I0320 15:22:32.917163 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95634505-7484-4887-973b-a91a632c48d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.126813 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:22:33 crc kubenswrapper[4764]: 
E0320 15:22:33.127254 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.284273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" event={"ID":"95634505-7484-4887-973b-a91a632c48d1","Type":"ContainerDied","Data":"800431beb29408080abd4d3fa6ed70fd0fc4289dbe177d5cfe318033946c0ec6"} Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.284311 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800431beb29408080abd4d3fa6ed70fd0fc4289dbe177d5cfe318033946c0ec6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.284352 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.377608 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6"] Mar 20 15:22:33 crc kubenswrapper[4764]: E0320 15:22:33.377958 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95634505-7484-4887-973b-a91a632c48d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.377974 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="95634505-7484-4887-973b-a91a632c48d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:33 crc kubenswrapper[4764]: E0320 15:22:33.377982 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f06100-41b2-4f66-8e59-3ccb3cd7c38b" containerName="oc" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.377989 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f06100-41b2-4f66-8e59-3ccb3cd7c38b" containerName="oc" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.378156 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="95634505-7484-4887-973b-a91a632c48d1" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.378197 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f06100-41b2-4f66-8e59-3ccb3cd7c38b" containerName="oc" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.378802 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.380981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.388991 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.389316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.389370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.391561 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6"] Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.528213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp5s\" (UniqueName: \"kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.528737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 
15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.528801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.630129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.630203 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.630260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp5s\" (UniqueName: \"kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.635542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.636777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.664710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp5s\" (UniqueName: \"kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:33 crc kubenswrapper[4764]: I0320 15:22:33.698461 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:34 crc kubenswrapper[4764]: I0320 15:22:34.329223 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6"] Mar 20 15:22:35 crc kubenswrapper[4764]: I0320 15:22:35.307725 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" event={"ID":"e14aaa5f-0501-4ce2-b63a-08fde03ed17a","Type":"ContainerStarted","Data":"233b5ccfc07407c9cef9f06a5e69892239c70100db4a51834f8041c7368cf822"} Mar 20 15:22:35 crc kubenswrapper[4764]: I0320 15:22:35.308259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" event={"ID":"e14aaa5f-0501-4ce2-b63a-08fde03ed17a","Type":"ContainerStarted","Data":"f38109efffbb1b32b8463fdcbe4c6fff107d6393fb9b1e8b03f8f5de7488b82d"} Mar 20 15:22:35 crc kubenswrapper[4764]: I0320 15:22:35.326904 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" podStartSLOduration=1.8209500379999999 podStartE2EDuration="2.326883481s" podCreationTimestamp="2026-03-20 15:22:33 +0000 UTC" firstStartedPulling="2026-03-20 15:22:34.338861104 +0000 UTC m=+1875.955050233" lastFinishedPulling="2026-03-20 15:22:34.844794547 +0000 UTC m=+1876.460983676" observedRunningTime="2026-03-20 15:22:35.323071321 +0000 UTC m=+1876.939260450" watchObservedRunningTime="2026-03-20 15:22:35.326883481 +0000 UTC m=+1876.943072610" Mar 20 15:22:40 crc kubenswrapper[4764]: I0320 15:22:40.355450 4764 generic.go:334] "Generic (PLEG): container finished" podID="e14aaa5f-0501-4ce2-b63a-08fde03ed17a" containerID="233b5ccfc07407c9cef9f06a5e69892239c70100db4a51834f8041c7368cf822" exitCode=0 Mar 20 15:22:40 crc kubenswrapper[4764]: I0320 15:22:40.355523 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" event={"ID":"e14aaa5f-0501-4ce2-b63a-08fde03ed17a","Type":"ContainerDied","Data":"233b5ccfc07407c9cef9f06a5e69892239c70100db4a51834f8041c7368cf822"} Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.782424 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.889842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp5s\" (UniqueName: \"kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s\") pod \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.889959 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam\") pod \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.890023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory\") pod \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\" (UID: \"e14aaa5f-0501-4ce2-b63a-08fde03ed17a\") " Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.898527 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s" (OuterVolumeSpecName: "kube-api-access-5dp5s") pod "e14aaa5f-0501-4ce2-b63a-08fde03ed17a" (UID: "e14aaa5f-0501-4ce2-b63a-08fde03ed17a"). InnerVolumeSpecName "kube-api-access-5dp5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.922357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory" (OuterVolumeSpecName: "inventory") pod "e14aaa5f-0501-4ce2-b63a-08fde03ed17a" (UID: "e14aaa5f-0501-4ce2-b63a-08fde03ed17a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.927738 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e14aaa5f-0501-4ce2-b63a-08fde03ed17a" (UID: "e14aaa5f-0501-4ce2-b63a-08fde03ed17a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.993131 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.993190 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:41 crc kubenswrapper[4764]: I0320 15:22:41.993207 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp5s\" (UniqueName: \"kubernetes.io/projected/e14aaa5f-0501-4ce2-b63a-08fde03ed17a-kube-api-access-5dp5s\") on node \"crc\" DevicePath \"\"" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.372148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" 
event={"ID":"e14aaa5f-0501-4ce2-b63a-08fde03ed17a","Type":"ContainerDied","Data":"f38109efffbb1b32b8463fdcbe4c6fff107d6393fb9b1e8b03f8f5de7488b82d"} Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.372196 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f38109efffbb1b32b8463fdcbe4c6fff107d6393fb9b1e8b03f8f5de7488b82d" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.372282 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.464233 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf"] Mar 20 15:22:42 crc kubenswrapper[4764]: E0320 15:22:42.464703 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14aaa5f-0501-4ce2-b63a-08fde03ed17a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.464727 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14aaa5f-0501-4ce2-b63a-08fde03ed17a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.464931 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14aaa5f-0501-4ce2-b63a-08fde03ed17a" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.465665 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.470476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.470599 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.470652 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.470839 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.483404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf"] Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.501948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.502005 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wss4z\" (UniqueName: \"kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.502114 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.603554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.603625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wss4z\" (UniqueName: \"kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.603765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.608177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.608777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.623483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wss4z\" (UniqueName: \"kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pmxrf\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:42 crc kubenswrapper[4764]: I0320 15:22:42.786217 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:22:43 crc kubenswrapper[4764]: I0320 15:22:43.374620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf"] Mar 20 15:22:43 crc kubenswrapper[4764]: I0320 15:22:43.385167 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" event={"ID":"78596cc8-76e1-4603-b970-b59b504531c3","Type":"ContainerStarted","Data":"ae3addf092c8a903313ae85e71c7591e3cf0838b8c996d5a0231c50c21ad56ac"} Mar 20 15:22:44 crc kubenswrapper[4764]: I0320 15:22:44.126356 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be" Mar 20 15:22:44 crc kubenswrapper[4764]: I0320 15:22:44.395918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c"} Mar 20 15:22:44 crc kubenswrapper[4764]: I0320 15:22:44.397707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" event={"ID":"78596cc8-76e1-4603-b970-b59b504531c3","Type":"ContainerStarted","Data":"947ec6001eab3f1556a296ad344aa7910e54e6933850ad75a06258c4705f55cd"} Mar 20 15:22:44 crc kubenswrapper[4764]: I0320 15:22:44.448163 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" podStartSLOduration=1.8235690679999998 podStartE2EDuration="2.448147276s" podCreationTimestamp="2026-03-20 15:22:42 +0000 UTC" firstStartedPulling="2026-03-20 15:22:43.372469171 +0000 UTC m=+1884.988658310" lastFinishedPulling="2026-03-20 15:22:43.997047369 +0000 UTC m=+1885.613236518" 
observedRunningTime="2026-03-20 15:22:44.445117004 +0000 UTC m=+1886.061306133" watchObservedRunningTime="2026-03-20 15:22:44.448147276 +0000 UTC m=+1886.064336405" Mar 20 15:23:06 crc kubenswrapper[4764]: I0320 15:23:06.047961 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-g8f4s"] Mar 20 15:23:06 crc kubenswrapper[4764]: I0320 15:23:06.067324 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-g8f4s"] Mar 20 15:23:07 crc kubenswrapper[4764]: I0320 15:23:07.188906 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6" path="/var/lib/kubelet/pods/fbbbb9bb-ae6f-4df8-9c52-b8a4e8b273e6/volumes" Mar 20 15:23:08 crc kubenswrapper[4764]: I0320 15:23:08.029143 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nwrmx"] Mar 20 15:23:08 crc kubenswrapper[4764]: I0320 15:23:08.037766 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nwrmx"] Mar 20 15:23:08 crc kubenswrapper[4764]: I0320 15:23:08.527543 4764 scope.go:117] "RemoveContainer" containerID="d792c22067c97259c38be96cc357702c6b4ad5618748f625c5a9e3dce4bdb43e" Mar 20 15:23:08 crc kubenswrapper[4764]: I0320 15:23:08.552057 4764 scope.go:117] "RemoveContainer" containerID="c29b5ce861c58e63c1aa72cf4eda65bae641e857b78531c017dca685c40e85b3" Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.036234 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-38d4-account-create-update-4zn9m"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.047545 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ffae-account-create-update-vn59x"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.059223 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9jbjr"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.071431 
4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cc83-account-create-update-rcg5p"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.078607 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-38d4-account-create-update-4zn9m"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.087208 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cc83-account-create-update-rcg5p"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.094325 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ffae-account-create-update-vn59x"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.101533 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9jbjr"] Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.138520 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318faba6-6466-4f38-8f29-b01709e93bea" path="/var/lib/kubelet/pods/318faba6-6466-4f38-8f29-b01709e93bea/volumes" Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.139383 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e806e8a-b9b4-47eb-bd27-a70a53705c32" path="/var/lib/kubelet/pods/7e806e8a-b9b4-47eb-bd27-a70a53705c32/volumes" Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.140084 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6bf917-4144-4e4a-9a7a-63aae89d8ad3" path="/var/lib/kubelet/pods/cc6bf917-4144-4e4a-9a7a-63aae89d8ad3/volumes" Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.140774 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3ec590-49ca-4806-9a9e-5699c798e051" path="/var/lib/kubelet/pods/ce3ec590-49ca-4806-9a9e-5699c798e051/volumes" Mar 20 15:23:09 crc kubenswrapper[4764]: I0320 15:23:09.142379 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15c13f0-2f86-4aba-b109-bac20813f7c6" 
path="/var/lib/kubelet/pods/e15c13f0-2f86-4aba-b109-bac20813f7c6/volumes" Mar 20 15:23:20 crc kubenswrapper[4764]: I0320 15:23:20.750684 4764 generic.go:334] "Generic (PLEG): container finished" podID="78596cc8-76e1-4603-b970-b59b504531c3" containerID="947ec6001eab3f1556a296ad344aa7910e54e6933850ad75a06258c4705f55cd" exitCode=0 Mar 20 15:23:20 crc kubenswrapper[4764]: I0320 15:23:20.750780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" event={"ID":"78596cc8-76e1-4603-b970-b59b504531c3","Type":"ContainerDied","Data":"947ec6001eab3f1556a296ad344aa7910e54e6933850ad75a06258c4705f55cd"} Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.219984 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.390952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wss4z\" (UniqueName: \"kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z\") pod \"78596cc8-76e1-4603-b970-b59b504531c3\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.391132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory\") pod \"78596cc8-76e1-4603-b970-b59b504531c3\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.391224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam\") pod \"78596cc8-76e1-4603-b970-b59b504531c3\" (UID: \"78596cc8-76e1-4603-b970-b59b504531c3\") " Mar 20 15:23:22 crc 
kubenswrapper[4764]: I0320 15:23:22.398618 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z" (OuterVolumeSpecName: "kube-api-access-wss4z") pod "78596cc8-76e1-4603-b970-b59b504531c3" (UID: "78596cc8-76e1-4603-b970-b59b504531c3"). InnerVolumeSpecName "kube-api-access-wss4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.428022 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78596cc8-76e1-4603-b970-b59b504531c3" (UID: "78596cc8-76e1-4603-b970-b59b504531c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.449200 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory" (OuterVolumeSpecName: "inventory") pod "78596cc8-76e1-4603-b970-b59b504531c3" (UID: "78596cc8-76e1-4603-b970-b59b504531c3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.493683 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wss4z\" (UniqueName: \"kubernetes.io/projected/78596cc8-76e1-4603-b970-b59b504531c3-kube-api-access-wss4z\") on node \"crc\" DevicePath \"\"" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.493713 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.493727 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78596cc8-76e1-4603-b970-b59b504531c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.770526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" event={"ID":"78596cc8-76e1-4603-b970-b59b504531c3","Type":"ContainerDied","Data":"ae3addf092c8a903313ae85e71c7591e3cf0838b8c996d5a0231c50c21ad56ac"} Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.770885 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3addf092c8a903313ae85e71c7591e3cf0838b8c996d5a0231c50c21ad56ac" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.770611 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pmxrf" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.935239 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g"] Mar 20 15:23:22 crc kubenswrapper[4764]: E0320 15:23:22.935734 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78596cc8-76e1-4603-b970-b59b504531c3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.935755 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="78596cc8-76e1-4603-b970-b59b504531c3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.935958 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="78596cc8-76e1-4603-b970-b59b504531c3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.936999 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.941210 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.941362 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.941539 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.941698 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:23:22 crc kubenswrapper[4764]: I0320 15:23:22.944744 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g"] Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.103292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.103651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.103903 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6vd\" (UniqueName: \"kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.205237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.205589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6vd\" (UniqueName: \"kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.205809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.210369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.210677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.233526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6vd\" (UniqueName: \"kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.258008 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:23:23 crc kubenswrapper[4764]: I0320 15:23:23.871940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g"] Mar 20 15:23:24 crc kubenswrapper[4764]: I0320 15:23:24.792967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" event={"ID":"580850a5-6a99-4108-aeb7-df44798943e8","Type":"ContainerStarted","Data":"e27922abbee35f351506cabfa38022f9418e21504146ee014f2969eafee10e72"} Mar 20 15:23:24 crc kubenswrapper[4764]: I0320 15:23:24.793270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" event={"ID":"580850a5-6a99-4108-aeb7-df44798943e8","Type":"ContainerStarted","Data":"eb44d75d92a7196574fa7efa26e607193b7581996d4c42d51cc4a912af9ff333"} Mar 20 15:23:24 crc kubenswrapper[4764]: I0320 15:23:24.811019 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" podStartSLOduration=2.39726802 podStartE2EDuration="2.810998077s" podCreationTimestamp="2026-03-20 15:23:22 +0000 UTC" firstStartedPulling="2026-03-20 15:23:23.867753691 +0000 UTC m=+1925.483942840" lastFinishedPulling="2026-03-20 15:23:24.281483768 +0000 UTC m=+1925.897672897" observedRunningTime="2026-03-20 15:23:24.810213213 +0000 UTC m=+1926.426402342" watchObservedRunningTime="2026-03-20 15:23:24.810998077 +0000 UTC m=+1926.427187206" Mar 20 15:23:34 crc kubenswrapper[4764]: I0320 15:23:34.047513 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x4dvd"] Mar 20 15:23:34 crc kubenswrapper[4764]: I0320 15:23:34.057918 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-x4dvd"] Mar 20 15:23:35 crc kubenswrapper[4764]: 
I0320 15:23:35.166426 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1fa9963-3520-46b8-a13f-aa11d6059432" path="/var/lib/kubelet/pods/e1fa9963-3520-46b8-a13f-aa11d6059432/volumes" Mar 20 15:23:56 crc kubenswrapper[4764]: I0320 15:23:56.058924 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cr8wd"] Mar 20 15:23:56 crc kubenswrapper[4764]: I0320 15:23:56.074188 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cr8wd"] Mar 20 15:23:57 crc kubenswrapper[4764]: I0320 15:23:57.038472 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k8k8m"] Mar 20 15:23:57 crc kubenswrapper[4764]: I0320 15:23:57.045803 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-k8k8m"] Mar 20 15:23:57 crc kubenswrapper[4764]: I0320 15:23:57.136255 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a21459-b477-4b13-847f-4997f3c4529f" path="/var/lib/kubelet/pods/15a21459-b477-4b13-847f-4997f3c4529f/volumes" Mar 20 15:23:57 crc kubenswrapper[4764]: I0320 15:23:57.137077 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a4d6564-31b4-4743-8acd-d1a431370201" path="/var/lib/kubelet/pods/9a4d6564-31b4-4743-8acd-d1a431370201/volumes" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.163087 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567004-26g8t"] Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.165311 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.168282 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.168675 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.169159 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.173884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567004-26g8t"] Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.321372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp\") pod \"auto-csr-approver-29567004-26g8t\" (UID: \"5b7fc759-c1c6-46ec-ab74-fb9f2546f661\") " pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.423326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp\") pod \"auto-csr-approver-29567004-26g8t\" (UID: \"5b7fc759-c1c6-46ec-ab74-fb9f2546f661\") " pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.451942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp\") pod \"auto-csr-approver-29567004-26g8t\" (UID: \"5b7fc759-c1c6-46ec-ab74-fb9f2546f661\") " 
pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:00 crc kubenswrapper[4764]: I0320 15:24:00.489956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:01 crc kubenswrapper[4764]: I0320 15:24:01.021587 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567004-26g8t"] Mar 20 15:24:01 crc kubenswrapper[4764]: I0320 15:24:01.166107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567004-26g8t" event={"ID":"5b7fc759-c1c6-46ec-ab74-fb9f2546f661","Type":"ContainerStarted","Data":"4a541935ac973dec6c1b4acc368e3022f2127c41890c99b2b7e36deb5a2ecb33"} Mar 20 15:24:03 crc kubenswrapper[4764]: I0320 15:24:03.195912 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b7fc759-c1c6-46ec-ab74-fb9f2546f661" containerID="5b9c2245ce56a346a03bb7d2c359fcf5d89f12ee63b70f87c29dcb9a1fd9e7f8" exitCode=0 Mar 20 15:24:03 crc kubenswrapper[4764]: I0320 15:24:03.196016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567004-26g8t" event={"ID":"5b7fc759-c1c6-46ec-ab74-fb9f2546f661","Type":"ContainerDied","Data":"5b9c2245ce56a346a03bb7d2c359fcf5d89f12ee63b70f87c29dcb9a1fd9e7f8"} Mar 20 15:24:04 crc kubenswrapper[4764]: I0320 15:24:04.548398 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:04 crc kubenswrapper[4764]: I0320 15:24:04.723284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp\") pod \"5b7fc759-c1c6-46ec-ab74-fb9f2546f661\" (UID: \"5b7fc759-c1c6-46ec-ab74-fb9f2546f661\") " Mar 20 15:24:04 crc kubenswrapper[4764]: I0320 15:24:04.728607 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp" (OuterVolumeSpecName: "kube-api-access-h9zdp") pod "5b7fc759-c1c6-46ec-ab74-fb9f2546f661" (UID: "5b7fc759-c1c6-46ec-ab74-fb9f2546f661"). InnerVolumeSpecName "kube-api-access-h9zdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:04 crc kubenswrapper[4764]: I0320 15:24:04.826416 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/5b7fc759-c1c6-46ec-ab74-fb9f2546f661-kube-api-access-h9zdp\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:05 crc kubenswrapper[4764]: I0320 15:24:05.230635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567004-26g8t" event={"ID":"5b7fc759-c1c6-46ec-ab74-fb9f2546f661","Type":"ContainerDied","Data":"4a541935ac973dec6c1b4acc368e3022f2127c41890c99b2b7e36deb5a2ecb33"} Mar 20 15:24:05 crc kubenswrapper[4764]: I0320 15:24:05.230692 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a541935ac973dec6c1b4acc368e3022f2127c41890c99b2b7e36deb5a2ecb33" Mar 20 15:24:05 crc kubenswrapper[4764]: I0320 15:24:05.230772 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567004-26g8t" Mar 20 15:24:05 crc kubenswrapper[4764]: I0320 15:24:05.612814 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566998-669xr"] Mar 20 15:24:05 crc kubenswrapper[4764]: I0320 15:24:05.625492 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566998-669xr"] Mar 20 15:24:07 crc kubenswrapper[4764]: I0320 15:24:07.138676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67727519-47f8-4167-a718-c140a71b2dae" path="/var/lib/kubelet/pods/67727519-47f8-4167-a718-c140a71b2dae/volumes" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.665979 4764 scope.go:117] "RemoveContainer" containerID="64406e85d4e06b92e3255fc64afe33bd8499a8cd5401e852428f006ced123ff7" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.709490 4764 scope.go:117] "RemoveContainer" containerID="f3229830082952e3dd130b86520b41db93101029258d5a49a565a96e4b17ad40" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.759255 4764 scope.go:117] "RemoveContainer" containerID="76c0c17724d68d2790828c6ae96c078ac5b894f816e6140aae4c872f7b44e425" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.837678 4764 scope.go:117] "RemoveContainer" containerID="c01781c56e21fce86204109ad8f7345f0237d0a20ea1aca7504d738fa2299089" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.884928 4764 scope.go:117] "RemoveContainer" containerID="2fbbb178bcbeebed7d961ec717e54209cb5b795217f7bc857740b8d8b736d0c4" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.907436 4764 scope.go:117] "RemoveContainer" containerID="531116529d796233fe33036f960abaa83b9644b04cd24fd2c622c4c194ccc7b9" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.955088 4764 scope.go:117] "RemoveContainer" containerID="61a5275ea2051cc4b1f273ae4bfbd7d011e22af1ade96cdafaa4ad36abe697fa" Mar 20 15:24:08 crc kubenswrapper[4764]: I0320 15:24:08.981792 4764 
scope.go:117] "RemoveContainer" containerID="348731f5dd646000d50072190415b5151b6adaaa6723bac653816c63f16a0ca0" Mar 20 15:24:09 crc kubenswrapper[4764]: I0320 15:24:09.003327 4764 scope.go:117] "RemoveContainer" containerID="e656f614766f094b6c759eccf2b240cdb5224a737c99af67801a7606c56a3898" Mar 20 15:24:16 crc kubenswrapper[4764]: I0320 15:24:16.328021 4764 generic.go:334] "Generic (PLEG): container finished" podID="580850a5-6a99-4108-aeb7-df44798943e8" containerID="e27922abbee35f351506cabfa38022f9418e21504146ee014f2969eafee10e72" exitCode=0 Mar 20 15:24:16 crc kubenswrapper[4764]: I0320 15:24:16.328139 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" event={"ID":"580850a5-6a99-4108-aeb7-df44798943e8","Type":"ContainerDied","Data":"e27922abbee35f351506cabfa38022f9418e21504146ee014f2969eafee10e72"} Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.796712 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.894761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw6vd\" (UniqueName: \"kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd\") pod \"580850a5-6a99-4108-aeb7-df44798943e8\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.895196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory\") pod \"580850a5-6a99-4108-aeb7-df44798943e8\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.895231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam\") pod \"580850a5-6a99-4108-aeb7-df44798943e8\" (UID: \"580850a5-6a99-4108-aeb7-df44798943e8\") " Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.907719 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd" (OuterVolumeSpecName: "kube-api-access-tw6vd") pod "580850a5-6a99-4108-aeb7-df44798943e8" (UID: "580850a5-6a99-4108-aeb7-df44798943e8"). InnerVolumeSpecName "kube-api-access-tw6vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.930596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "580850a5-6a99-4108-aeb7-df44798943e8" (UID: "580850a5-6a99-4108-aeb7-df44798943e8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.947804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory" (OuterVolumeSpecName: "inventory") pod "580850a5-6a99-4108-aeb7-df44798943e8" (UID: "580850a5-6a99-4108-aeb7-df44798943e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.997765 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.998044 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/580850a5-6a99-4108-aeb7-df44798943e8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:17 crc kubenswrapper[4764]: I0320 15:24:17.998173 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw6vd\" (UniqueName: \"kubernetes.io/projected/580850a5-6a99-4108-aeb7-df44798943e8-kube-api-access-tw6vd\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.352267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" 
event={"ID":"580850a5-6a99-4108-aeb7-df44798943e8","Type":"ContainerDied","Data":"eb44d75d92a7196574fa7efa26e607193b7581996d4c42d51cc4a912af9ff333"} Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.352305 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb44d75d92a7196574fa7efa26e607193b7581996d4c42d51cc4a912af9ff333" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.352320 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.542645 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxvwp"] Mar 20 15:24:18 crc kubenswrapper[4764]: E0320 15:24:18.543249 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7fc759-c1c6-46ec-ab74-fb9f2546f661" containerName="oc" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.543271 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7fc759-c1c6-46ec-ab74-fb9f2546f661" containerName="oc" Mar 20 15:24:18 crc kubenswrapper[4764]: E0320 15:24:18.543295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580850a5-6a99-4108-aeb7-df44798943e8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.543314 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="580850a5-6a99-4108-aeb7-df44798943e8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.543559 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="580850a5-6a99-4108-aeb7-df44798943e8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.543587 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7fc759-c1c6-46ec-ab74-fb9f2546f661" 
containerName="oc" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.544442 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.553150 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.553505 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.553839 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.554082 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.561370 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxvwp"] Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.716160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.716255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrr4q\" (UniqueName: \"kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 
15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.716663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.818912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.818989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.819023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrr4q\" (UniqueName: \"kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.824617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: 
\"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.824729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.853517 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrr4q\" (UniqueName: \"kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q\") pod \"ssh-known-hosts-edpm-deployment-xxvwp\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:18 crc kubenswrapper[4764]: I0320 15:24:18.870250 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:19 crc kubenswrapper[4764]: I0320 15:24:19.432754 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xxvwp"] Mar 20 15:24:20 crc kubenswrapper[4764]: I0320 15:24:20.002306 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:24:20 crc kubenswrapper[4764]: I0320 15:24:20.384172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" event={"ID":"6921f642-8ca4-4d60-bd80-9e5db110986f","Type":"ContainerStarted","Data":"1869147ad538cb17fc5f227b7559ad651cf85802a3be6f1ba4ce3c083e169b3a"} Mar 20 15:24:20 crc kubenswrapper[4764]: I0320 15:24:20.384758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" event={"ID":"6921f642-8ca4-4d60-bd80-9e5db110986f","Type":"ContainerStarted","Data":"23e86bb09f5be3eb96ef1a0ce411b86dbd847ab8ceb32f5df60a82aac3105e48"} Mar 20 15:24:20 crc kubenswrapper[4764]: I0320 15:24:20.406124 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" podStartSLOduration=1.845862537 podStartE2EDuration="2.40610581s" podCreationTimestamp="2026-03-20 15:24:18 +0000 UTC" firstStartedPulling="2026-03-20 15:24:19.438268189 +0000 UTC m=+1981.054457318" lastFinishedPulling="2026-03-20 15:24:19.998511422 +0000 UTC m=+1981.614700591" observedRunningTime="2026-03-20 15:24:20.395674619 +0000 UTC m=+1982.011863738" watchObservedRunningTime="2026-03-20 15:24:20.40610581 +0000 UTC m=+1982.022294939" Mar 20 15:24:27 crc kubenswrapper[4764]: I0320 15:24:27.455716 4764 generic.go:334] "Generic (PLEG): container finished" podID="6921f642-8ca4-4d60-bd80-9e5db110986f" containerID="1869147ad538cb17fc5f227b7559ad651cf85802a3be6f1ba4ce3c083e169b3a" exitCode=0 Mar 20 15:24:27 crc 
kubenswrapper[4764]: I0320 15:24:27.455835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" event={"ID":"6921f642-8ca4-4d60-bd80-9e5db110986f","Type":"ContainerDied","Data":"1869147ad538cb17fc5f227b7559ad651cf85802a3be6f1ba4ce3c083e169b3a"} Mar 20 15:24:28 crc kubenswrapper[4764]: I0320 15:24:28.882021 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.034946 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0\") pod \"6921f642-8ca4-4d60-bd80-9e5db110986f\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.035081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam\") pod \"6921f642-8ca4-4d60-bd80-9e5db110986f\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.035191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrr4q\" (UniqueName: \"kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q\") pod \"6921f642-8ca4-4d60-bd80-9e5db110986f\" (UID: \"6921f642-8ca4-4d60-bd80-9e5db110986f\") " Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.049757 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q" (OuterVolumeSpecName: "kube-api-access-xrr4q") pod "6921f642-8ca4-4d60-bd80-9e5db110986f" (UID: "6921f642-8ca4-4d60-bd80-9e5db110986f"). InnerVolumeSpecName "kube-api-access-xrr4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.080764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6921f642-8ca4-4d60-bd80-9e5db110986f" (UID: "6921f642-8ca4-4d60-bd80-9e5db110986f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.081504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "6921f642-8ca4-4d60-bd80-9e5db110986f" (UID: "6921f642-8ca4-4d60-bd80-9e5db110986f"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.149098 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.149300 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrr4q\" (UniqueName: \"kubernetes.io/projected/6921f642-8ca4-4d60-bd80-9e5db110986f-kube-api-access-xrr4q\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.149311 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/6921f642-8ca4-4d60-bd80-9e5db110986f-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.480594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" 
event={"ID":"6921f642-8ca4-4d60-bd80-9e5db110986f","Type":"ContainerDied","Data":"23e86bb09f5be3eb96ef1a0ce411b86dbd847ab8ceb32f5df60a82aac3105e48"} Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.480654 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e86bb09f5be3eb96ef1a0ce411b86dbd847ab8ceb32f5df60a82aac3105e48" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.480687 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xxvwp" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.615766 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg"] Mar 20 15:24:29 crc kubenswrapper[4764]: E0320 15:24:29.616094 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921f642-8ca4-4d60-bd80-9e5db110986f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.616109 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921f642-8ca4-4d60-bd80-9e5db110986f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.616303 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6921f642-8ca4-4d60-bd80-9e5db110986f" containerName="ssh-known-hosts-edpm-deployment" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.616883 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.618855 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.619237 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.619736 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.624146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.631297 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg"] Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.763092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.763143 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z658r\" (UniqueName: \"kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.763259 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.865642 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.865727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.865760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z658r\" (UniqueName: \"kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.869315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: 
\"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.874797 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.888316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z658r\" (UniqueName: \"kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2x6qg\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:29 crc kubenswrapper[4764]: I0320 15:24:29.932102 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:30 crc kubenswrapper[4764]: I0320 15:24:30.491315 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg"] Mar 20 15:24:31 crc kubenswrapper[4764]: I0320 15:24:31.520975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" event={"ID":"0e86c9d2-c700-4e8a-aec2-d808fe36be79","Type":"ContainerStarted","Data":"3020d571ee026ae1a10d1e599412b89ad7cadfee0fff772085ad1d4afc31842f"} Mar 20 15:24:31 crc kubenswrapper[4764]: I0320 15:24:31.521395 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" event={"ID":"0e86c9d2-c700-4e8a-aec2-d808fe36be79","Type":"ContainerStarted","Data":"14bc37253462532f0b0c15c72702a8dcaa7dd16a5848247661bc9eb6a9b4c6f3"} Mar 20 15:24:31 crc kubenswrapper[4764]: I0320 15:24:31.550306 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" podStartSLOduration=2.088627869 podStartE2EDuration="2.550290041s" podCreationTimestamp="2026-03-20 15:24:29 +0000 UTC" firstStartedPulling="2026-03-20 15:24:30.502030816 +0000 UTC m=+1992.118219945" lastFinishedPulling="2026-03-20 15:24:30.963692988 +0000 UTC m=+1992.579882117" observedRunningTime="2026-03-20 15:24:31.549812986 +0000 UTC m=+1993.166002125" watchObservedRunningTime="2026-03-20 15:24:31.550290041 +0000 UTC m=+1993.166479170" Mar 20 15:24:39 crc kubenswrapper[4764]: I0320 15:24:39.603081 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e86c9d2-c700-4e8a-aec2-d808fe36be79" containerID="3020d571ee026ae1a10d1e599412b89ad7cadfee0fff772085ad1d4afc31842f" exitCode=0 Mar 20 15:24:39 crc kubenswrapper[4764]: I0320 15:24:39.603247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" event={"ID":"0e86c9d2-c700-4e8a-aec2-d808fe36be79","Type":"ContainerDied","Data":"3020d571ee026ae1a10d1e599412b89ad7cadfee0fff772085ad1d4afc31842f"} Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.042026 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-9x8xg"] Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.048727 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.056908 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-9x8xg"] Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.157480 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62396d5-6708-4d82-863d-5c4a7613290d" path="/var/lib/kubelet/pods/a62396d5-6708-4d82-863d-5c4a7613290d/volumes" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.201084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z658r\" (UniqueName: \"kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r\") pod \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.201165 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam\") pod \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.201500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory\") pod 
\"0e86c9d2-c700-4e8a-aec2-d808fe36be79\" (UID: \"0e86c9d2-c700-4e8a-aec2-d808fe36be79\") " Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.207060 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r" (OuterVolumeSpecName: "kube-api-access-z658r") pod "0e86c9d2-c700-4e8a-aec2-d808fe36be79" (UID: "0e86c9d2-c700-4e8a-aec2-d808fe36be79"). InnerVolumeSpecName "kube-api-access-z658r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.229125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e86c9d2-c700-4e8a-aec2-d808fe36be79" (UID: "0e86c9d2-c700-4e8a-aec2-d808fe36be79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.243133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory" (OuterVolumeSpecName: "inventory") pod "0e86c9d2-c700-4e8a-aec2-d808fe36be79" (UID: "0e86c9d2-c700-4e8a-aec2-d808fe36be79"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.304616 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.304683 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e86c9d2-c700-4e8a-aec2-d808fe36be79-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.304714 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z658r\" (UniqueName: \"kubernetes.io/projected/0e86c9d2-c700-4e8a-aec2-d808fe36be79-kube-api-access-z658r\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.628021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" event={"ID":"0e86c9d2-c700-4e8a-aec2-d808fe36be79","Type":"ContainerDied","Data":"14bc37253462532f0b0c15c72702a8dcaa7dd16a5848247661bc9eb6a9b4c6f3"} Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.628417 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14bc37253462532f0b0c15c72702a8dcaa7dd16a5848247661bc9eb6a9b4c6f3" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.628103 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2x6qg" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.733046 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6"] Mar 20 15:24:41 crc kubenswrapper[4764]: E0320 15:24:41.733457 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e86c9d2-c700-4e8a-aec2-d808fe36be79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.733471 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e86c9d2-c700-4e8a-aec2-d808fe36be79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.733631 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e86c9d2-c700-4e8a-aec2-d808fe36be79" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.734195 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.736656 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.736761 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.742287 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.742418 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.765508 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6"] Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.815335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.815435 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s578w\" (UniqueName: \"kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 
15:24:41.815841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.917630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.917762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.917875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s578w\" (UniqueName: \"kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.921928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.922011 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:41 crc kubenswrapper[4764]: I0320 15:24:41.935059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s578w\" (UniqueName: \"kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:42 crc kubenswrapper[4764]: I0320 15:24:42.055328 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:42 crc kubenswrapper[4764]: I0320 15:24:42.631135 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6"] Mar 20 15:24:42 crc kubenswrapper[4764]: I0320 15:24:42.638856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" event={"ID":"0b67f53a-96b2-487b-b68e-60560ba40a02","Type":"ContainerStarted","Data":"4ecbad3c2d685a38b790c252d1986489684cd539b240aa9408f94011363d9e28"} Mar 20 15:24:43 crc kubenswrapper[4764]: I0320 15:24:43.648924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" event={"ID":"0b67f53a-96b2-487b-b68e-60560ba40a02","Type":"ContainerStarted","Data":"70de06ecff5d3ae2ea258e9b1856c9aec12289016a0d7c99f314696ffe0f4436"} Mar 20 15:24:43 crc kubenswrapper[4764]: I0320 15:24:43.675476 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" podStartSLOduration=2.147685513 podStartE2EDuration="2.675457716s" podCreationTimestamp="2026-03-20 15:24:41 +0000 UTC" firstStartedPulling="2026-03-20 15:24:42.629164073 +0000 UTC m=+2004.245353202" lastFinishedPulling="2026-03-20 15:24:43.156936276 +0000 UTC m=+2004.773125405" observedRunningTime="2026-03-20 15:24:43.670821314 +0000 UTC m=+2005.287010443" watchObservedRunningTime="2026-03-20 15:24:43.675457716 +0000 UTC m=+2005.291646845" Mar 20 15:24:52 crc kubenswrapper[4764]: I0320 15:24:52.731929 4764 generic.go:334] "Generic (PLEG): container finished" podID="0b67f53a-96b2-487b-b68e-60560ba40a02" containerID="70de06ecff5d3ae2ea258e9b1856c9aec12289016a0d7c99f314696ffe0f4436" exitCode=0 Mar 20 15:24:52 crc kubenswrapper[4764]: I0320 15:24:52.732033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" event={"ID":"0b67f53a-96b2-487b-b68e-60560ba40a02","Type":"ContainerDied","Data":"70de06ecff5d3ae2ea258e9b1856c9aec12289016a0d7c99f314696ffe0f4436"} Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.620359 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.692845 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory\") pod \"0b67f53a-96b2-487b-b68e-60560ba40a02\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.693049 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam\") pod \"0b67f53a-96b2-487b-b68e-60560ba40a02\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.693141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s578w\" (UniqueName: \"kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w\") pod \"0b67f53a-96b2-487b-b68e-60560ba40a02\" (UID: \"0b67f53a-96b2-487b-b68e-60560ba40a02\") " Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.699044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w" (OuterVolumeSpecName: "kube-api-access-s578w") pod "0b67f53a-96b2-487b-b68e-60560ba40a02" (UID: "0b67f53a-96b2-487b-b68e-60560ba40a02"). InnerVolumeSpecName "kube-api-access-s578w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.720318 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory" (OuterVolumeSpecName: "inventory") pod "0b67f53a-96b2-487b-b68e-60560ba40a02" (UID: "0b67f53a-96b2-487b-b68e-60560ba40a02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.721100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b67f53a-96b2-487b-b68e-60560ba40a02" (UID: "0b67f53a-96b2-487b-b68e-60560ba40a02"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.758923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" event={"ID":"0b67f53a-96b2-487b-b68e-60560ba40a02","Type":"ContainerDied","Data":"4ecbad3c2d685a38b790c252d1986489684cd539b240aa9408f94011363d9e28"} Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.759271 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ecbad3c2d685a38b790c252d1986489684cd539b240aa9408f94011363d9e28" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.758995 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.795744 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.795772 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s578w\" (UniqueName: \"kubernetes.io/projected/0b67f53a-96b2-487b-b68e-60560ba40a02-kube-api-access-s578w\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.795916 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b67f53a-96b2-487b-b68e-60560ba40a02-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.831039 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q"] Mar 20 15:24:54 crc kubenswrapper[4764]: E0320 15:24:54.832046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b67f53a-96b2-487b-b68e-60560ba40a02" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.832098 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b67f53a-96b2-487b-b68e-60560ba40a02" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.832612 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b67f53a-96b2-487b-b68e-60560ba40a02" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.833532 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.835769 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.836036 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.836197 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.836455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.836826 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.837121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.837297 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.841692 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.863739 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q"] Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898509 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.898990 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.899035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:54 crc kubenswrapper[4764]: I0320 15:24:54.899100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001573 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001697 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.001991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.006417 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 
15:24:55.008292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.008754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.008893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.009332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.009693 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.010886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.011347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.012572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.013591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.013722 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.014518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.019209 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.019300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.159764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.517920 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q"] Mar 20 15:24:55 crc kubenswrapper[4764]: I0320 15:24:55.769070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" event={"ID":"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966","Type":"ContainerStarted","Data":"acc403c654a749ef5a3c26af1407032ee7b19797365d75fbb18561fa32a5b4a0"} Mar 20 15:24:56 crc kubenswrapper[4764]: I0320 15:24:56.778202 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" event={"ID":"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966","Type":"ContainerStarted","Data":"8cd6798825a80fdb1591af751f1495ede3d88a0b08ac81a53d9a5ff1f2999ef0"} Mar 20 15:25:08 crc kubenswrapper[4764]: I0320 15:25:08.443236 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:25:08 crc kubenswrapper[4764]: I0320 15:25:08.443758 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 20 15:25:09 crc kubenswrapper[4764]: I0320 15:25:09.219480 4764 scope.go:117] "RemoveContainer" containerID="265f50780edff0cad1f2ae2abc645b4b861d39b8ead249186613c401345e77a1" Mar 20 15:25:35 crc kubenswrapper[4764]: I0320 15:25:35.181892 4764 generic.go:334] "Generic (PLEG): container finished" podID="5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" containerID="8cd6798825a80fdb1591af751f1495ede3d88a0b08ac81a53d9a5ff1f2999ef0" exitCode=0 Mar 20 15:25:35 crc kubenswrapper[4764]: I0320 15:25:35.182550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" event={"ID":"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966","Type":"ContainerDied","Data":"8cd6798825a80fdb1591af751f1495ede3d88a0b08ac81a53d9a5ff1f2999ef0"} Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.629140 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706122 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706347 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706435 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706516 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706730 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706771 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.706795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle\") pod \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\" (UID: \"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966\") " Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.714378 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.715000 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.715027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.715714 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.717420 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.718371 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.719201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.721860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz" (OuterVolumeSpecName: "kube-api-access-4bltz") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "kube-api-access-4bltz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.722629 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.723010 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.725591 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.725749 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.752882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory" (OuterVolumeSpecName: "inventory") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.763722 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" (UID: "5201bbcb-c3f8-4a8d-81be-b4eaf33cc966"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809510 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809555 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809570 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809587 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809601 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809614 4764 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809624 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bltz\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-kube-api-access-4bltz\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809642 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809653 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809663 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809671 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809679 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:36 crc kubenswrapper[4764]: I0320 15:25:36.809688 4764 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/5201bbcb-c3f8-4a8d-81be-b4eaf33cc966-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.202345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" event={"ID":"5201bbcb-c3f8-4a8d-81be-b4eaf33cc966","Type":"ContainerDied","Data":"acc403c654a749ef5a3c26af1407032ee7b19797365d75fbb18561fa32a5b4a0"} Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.202673 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc403c654a749ef5a3c26af1407032ee7b19797365d75fbb18561fa32a5b4a0" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.202410 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.349622 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk"] Mar 20 15:25:37 crc kubenswrapper[4764]: E0320 15:25:37.350047 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.350064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.350282 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5201bbcb-c3f8-4a8d-81be-b4eaf33cc966" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.351075 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.354079 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.354106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.354824 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.354842 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.355136 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.359838 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk"] Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.421164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8hmr\" (UniqueName: \"kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.421446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: 
\"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.421598 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.421735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.421887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.523769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.523863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.523995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8hmr\" (UniqueName: \"kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.524024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.524155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.525443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.528302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.529236 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.530264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.541317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8hmr\" (UniqueName: \"kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v2tgk\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:37 crc kubenswrapper[4764]: I0320 15:25:37.697361 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" Mar 20 15:25:38 crc kubenswrapper[4764]: I0320 15:25:38.238332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk"] Mar 20 15:25:38 crc kubenswrapper[4764]: I0320 15:25:38.444021 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:25:38 crc kubenswrapper[4764]: I0320 15:25:38.444082 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:25:39 crc kubenswrapper[4764]: I0320 15:25:39.222902 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" event={"ID":"28a506a3-463d-4bc4-ab93-2e8201878e60","Type":"ContainerStarted","Data":"d4ec91c715574acc5ffce7b60a5a0007f0f781b3fd3b641b3940ae6b4ea7a54c"} Mar 20 15:25:39 crc kubenswrapper[4764]: I0320 15:25:39.223272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" event={"ID":"28a506a3-463d-4bc4-ab93-2e8201878e60","Type":"ContainerStarted","Data":"1feaa14a733dfb74b7b59b38c47d314111dc06dbeffd7ba96563f772d75dfd14"} Mar 20 15:25:39 crc kubenswrapper[4764]: I0320 15:25:39.246002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" podStartSLOduration=1.732589381 podStartE2EDuration="2.245973443s" podCreationTimestamp="2026-03-20 15:25:37 +0000 UTC" 
firstStartedPulling="2026-03-20 15:25:38.254459254 +0000 UTC m=+2059.870648413" lastFinishedPulling="2026-03-20 15:25:38.767843346 +0000 UTC m=+2060.384032475" observedRunningTime="2026-03-20 15:25:39.237239814 +0000 UTC m=+2060.853428943" watchObservedRunningTime="2026-03-20 15:25:39.245973443 +0000 UTC m=+2060.862162602" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.613824 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.618904 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.632209 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.701883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4tpc\" (UniqueName: \"kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.702085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.702207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content\") pod \"redhat-marketplace-qxt87\" 
(UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.804144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4tpc\" (UniqueName: \"kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.804262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.804324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.804921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.807721 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " 
pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.833476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4tpc\" (UniqueName: \"kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc\") pod \"redhat-marketplace-qxt87\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:48 crc kubenswrapper[4764]: I0320 15:25:48.985878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:49 crc kubenswrapper[4764]: I0320 15:25:49.424370 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:25:50 crc kubenswrapper[4764]: I0320 15:25:50.335840 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef036762-4081-4a2c-a8f7-249b11310308" containerID="0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385" exitCode=0 Mar 20 15:25:50 crc kubenswrapper[4764]: I0320 15:25:50.335917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerDied","Data":"0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385"} Mar 20 15:25:50 crc kubenswrapper[4764]: I0320 15:25:50.336769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerStarted","Data":"5a2d3bebaa63d0f9a136e7bf79615a29b505d549241ceda54a5059f8cbcc2ea2"} Mar 20 15:25:52 crc kubenswrapper[4764]: I0320 15:25:52.355172 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef036762-4081-4a2c-a8f7-249b11310308" containerID="7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da" exitCode=0 Mar 20 15:25:52 crc 
kubenswrapper[4764]: I0320 15:25:52.355275 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerDied","Data":"7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da"} Mar 20 15:25:53 crc kubenswrapper[4764]: I0320 15:25:53.368294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerStarted","Data":"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf"} Mar 20 15:25:53 crc kubenswrapper[4764]: I0320 15:25:53.388142 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qxt87" podStartSLOduration=2.8347101070000003 podStartE2EDuration="5.38812837s" podCreationTimestamp="2026-03-20 15:25:48 +0000 UTC" firstStartedPulling="2026-03-20 15:25:50.340099633 +0000 UTC m=+2071.956288782" lastFinishedPulling="2026-03-20 15:25:52.893517906 +0000 UTC m=+2074.509707045" observedRunningTime="2026-03-20 15:25:53.384937082 +0000 UTC m=+2075.001126211" watchObservedRunningTime="2026-03-20 15:25:53.38812837 +0000 UTC m=+2075.004317499" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.010585 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgsj9"] Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.016784 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.021062 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgsj9"] Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.117683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.118037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.118275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9pxk\" (UniqueName: \"kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.220426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9pxk\" (UniqueName: \"kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.220577 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.220751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.221268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.223807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.245669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9pxk\" (UniqueName: \"kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk\") pod \"community-operators-xgsj9\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") " pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.346568 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgsj9" Mar 20 15:25:57 crc kubenswrapper[4764]: I0320 15:25:57.828195 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgsj9"] Mar 20 15:25:58 crc kubenswrapper[4764]: I0320 15:25:58.414694 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerID="794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269" exitCode=0 Mar 20 15:25:58 crc kubenswrapper[4764]: I0320 15:25:58.414734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerDied","Data":"794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269"} Mar 20 15:25:58 crc kubenswrapper[4764]: I0320 15:25:58.414756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerStarted","Data":"d09efbb646b899c751a8785ff97dd319ff335351266984a6e44bcbc9ee20ce17"} Mar 20 15:25:58 crc kubenswrapper[4764]: I0320 15:25:58.986215 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:58 crc kubenswrapper[4764]: I0320 15:25:58.986485 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:59 crc kubenswrapper[4764]: I0320 15:25:59.045449 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:25:59 crc kubenswrapper[4764]: I0320 15:25:59.437277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" 
event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerStarted","Data":"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"} Mar 20 15:25:59 crc kubenswrapper[4764]: I0320 15:25:59.504992 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.151749 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567006-xnnkt"] Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.153193 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-xnnkt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.159450 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.159622 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.159950 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.168341 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-xnnkt"] Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.285188 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8qkz\" (UniqueName: \"kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz\") pod \"auto-csr-approver-29567006-xnnkt\" (UID: \"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226\") " pod="openshift-infra/auto-csr-approver-29567006-xnnkt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.387558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n8qkz\" (UniqueName: \"kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz\") pod \"auto-csr-approver-29567006-xnnkt\" (UID: \"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226\") " pod="openshift-infra/auto-csr-approver-29567006-xnnkt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.408959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8qkz\" (UniqueName: \"kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz\") pod \"auto-csr-approver-29567006-xnnkt\" (UID: \"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226\") " pod="openshift-infra/auto-csr-approver-29567006-xnnkt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.451490 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerID="b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171" exitCode=0 Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.451543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerDied","Data":"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"} Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.472414 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-xnnkt" Mar 20 15:26:00 crc kubenswrapper[4764]: I0320 15:26:00.928227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-xnnkt"] Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.363780 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.465558 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerStarted","Data":"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"} Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.468498 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567006-xnnkt" event={"ID":"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226","Type":"ContainerStarted","Data":"104d5fa756c850a847762da738526b714c570cce63ac1e4a51cdd8e5454d5580"} Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.468612 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qxt87" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="registry-server" containerID="cri-o://62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf" gracePeriod=2 Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.494489 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgsj9" podStartSLOduration=3.021885912 podStartE2EDuration="5.494466269s" podCreationTimestamp="2026-03-20 15:25:56 +0000 UTC" firstStartedPulling="2026-03-20 15:25:58.417373743 +0000 UTC m=+2080.033562872" lastFinishedPulling="2026-03-20 15:26:00.8899541 +0000 UTC m=+2082.506143229" observedRunningTime="2026-03-20 15:26:01.48113169 +0000 UTC m=+2083.097320829" 
watchObservedRunningTime="2026-03-20 15:26:01.494466269 +0000 UTC m=+2083.110655398" Mar 20 15:26:01 crc kubenswrapper[4764]: I0320 15:26:01.965662 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.126095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content\") pod \"ef036762-4081-4a2c-a8f7-249b11310308\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.126241 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4tpc\" (UniqueName: \"kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc\") pod \"ef036762-4081-4a2c-a8f7-249b11310308\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.126306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities\") pod \"ef036762-4081-4a2c-a8f7-249b11310308\" (UID: \"ef036762-4081-4a2c-a8f7-249b11310308\") " Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.127670 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities" (OuterVolumeSpecName: "utilities") pod "ef036762-4081-4a2c-a8f7-249b11310308" (UID: "ef036762-4081-4a2c-a8f7-249b11310308"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.131584 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc" (OuterVolumeSpecName: "kube-api-access-p4tpc") pod "ef036762-4081-4a2c-a8f7-249b11310308" (UID: "ef036762-4081-4a2c-a8f7-249b11310308"). InnerVolumeSpecName "kube-api-access-p4tpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.183826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef036762-4081-4a2c-a8f7-249b11310308" (UID: "ef036762-4081-4a2c-a8f7-249b11310308"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.228219 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.228268 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef036762-4081-4a2c-a8f7-249b11310308-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.228287 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4tpc\" (UniqueName: \"kubernetes.io/projected/ef036762-4081-4a2c-a8f7-249b11310308-kube-api-access-p4tpc\") on node \"crc\" DevicePath \"\"" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.480612 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef036762-4081-4a2c-a8f7-249b11310308" 
containerID="62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf" exitCode=0 Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.480684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerDied","Data":"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf"} Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.480983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qxt87" event={"ID":"ef036762-4081-4a2c-a8f7-249b11310308","Type":"ContainerDied","Data":"5a2d3bebaa63d0f9a136e7bf79615a29b505d549241ceda54a5059f8cbcc2ea2"} Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.481004 4764 scope.go:117] "RemoveContainer" containerID="62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.480743 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qxt87" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.482563 4764 generic.go:334] "Generic (PLEG): container finished" podID="219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" containerID="a5bf6dd5fc4e14d8d1cd9e5179a0b6c5645ba5b78b626aaf5252395c3023d2ff" exitCode=0 Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.482905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567006-xnnkt" event={"ID":"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226","Type":"ContainerDied","Data":"a5bf6dd5fc4e14d8d1cd9e5179a0b6c5645ba5b78b626aaf5252395c3023d2ff"} Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.499871 4764 scope.go:117] "RemoveContainer" containerID="7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.524372 4764 scope.go:117] "RemoveContainer" containerID="0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.540591 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.550410 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qxt87"] Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.555277 4764 scope.go:117] "RemoveContainer" containerID="62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf" Mar 20 15:26:02 crc kubenswrapper[4764]: E0320 15:26:02.555759 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf\": container with ID starting with 62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf not found: ID does not exist" 
containerID="62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.555842 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf"} err="failed to get container status \"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf\": rpc error: code = NotFound desc = could not find container \"62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf\": container with ID starting with 62d3a4f3b71ad5e18f5c32e6e835dca0f187d20fdb1e5481240f5484a5663edf not found: ID does not exist" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.555874 4764 scope.go:117] "RemoveContainer" containerID="7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da" Mar 20 15:26:02 crc kubenswrapper[4764]: E0320 15:26:02.556428 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da\": container with ID starting with 7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da not found: ID does not exist" containerID="7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.556467 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da"} err="failed to get container status \"7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da\": rpc error: code = NotFound desc = could not find container \"7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da\": container with ID starting with 7c25d612d6eae6751fb9eb18171c3176dee04e0b2d1e36fb6e89145c045881da not found: ID does not exist" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.556493 4764 scope.go:117] 
"RemoveContainer" containerID="0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385" Mar 20 15:26:02 crc kubenswrapper[4764]: E0320 15:26:02.556949 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385\": container with ID starting with 0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385 not found: ID does not exist" containerID="0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385" Mar 20 15:26:02 crc kubenswrapper[4764]: I0320 15:26:02.556983 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385"} err="failed to get container status \"0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385\": rpc error: code = NotFound desc = could not find container \"0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385\": container with ID starting with 0ef322b21325bcaee313b7a5cff9c292b61e49e2d14b9245b7af77c306c10385 not found: ID does not exist" Mar 20 15:26:03 crc kubenswrapper[4764]: I0320 15:26:03.138907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef036762-4081-4a2c-a8f7-249b11310308" path="/var/lib/kubelet/pods/ef036762-4081-4a2c-a8f7-249b11310308/volumes" Mar 20 15:26:03 crc kubenswrapper[4764]: I0320 15:26:03.882316 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-xnnkt"
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.074253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8qkz\" (UniqueName: \"kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz\") pod \"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226\" (UID: \"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226\") "
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.080573 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz" (OuterVolumeSpecName: "kube-api-access-n8qkz") pod "219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" (UID: "219079ac-b1c7-4b7a-8e7f-4f75e1fc2226"). InnerVolumeSpecName "kube-api-access-n8qkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.176287 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8qkz\" (UniqueName: \"kubernetes.io/projected/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226-kube-api-access-n8qkz\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.508050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567006-xnnkt" event={"ID":"219079ac-b1c7-4b7a-8e7f-4f75e1fc2226","Type":"ContainerDied","Data":"104d5fa756c850a847762da738526b714c570cce63ac1e4a51cdd8e5454d5580"}
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.508426 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="104d5fa756c850a847762da738526b714c570cce63ac1e4a51cdd8e5454d5580"
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.508185 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567006-xnnkt"
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.958173 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567000-r6rgv"]
Mar 20 15:26:04 crc kubenswrapper[4764]: I0320 15:26:04.967435 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567000-r6rgv"]
Mar 20 15:26:05 crc kubenswrapper[4764]: I0320 15:26:05.139011 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9668ba6-0faf-40eb-a0a2-9d0557167dda" path="/var/lib/kubelet/pods/b9668ba6-0faf-40eb-a0a2-9d0557167dda/volumes"
Mar 20 15:26:07 crc kubenswrapper[4764]: I0320 15:26:07.347703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:07 crc kubenswrapper[4764]: I0320 15:26:07.348099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:07 crc kubenswrapper[4764]: I0320 15:26:07.410731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:07 crc kubenswrapper[4764]: I0320 15:26:07.581038 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:07 crc kubenswrapper[4764]: I0320 15:26:07.651210 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgsj9"]
Mar 20 15:26:08 crc kubenswrapper[4764]: I0320 15:26:08.444034 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:26:08 crc kubenswrapper[4764]: I0320 15:26:08.444466 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:26:08 crc kubenswrapper[4764]: I0320 15:26:08.444529 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5"
Mar 20 15:26:08 crc kubenswrapper[4764]: I0320 15:26:08.445438 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:26:08 crc kubenswrapper[4764]: I0320 15:26:08.445509 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c" gracePeriod=600
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.327801 4764 scope.go:117] "RemoveContainer" containerID="94cd599e827ac5d634a1c4690843f72536f703ae1d9e1e0adaeb1969834c5aaa"
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.556169 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c" exitCode=0
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.556393 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgsj9" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="registry-server" containerID="cri-o://a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c" gracePeriod=2
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.556669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c"}
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.556699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"}
Mar 20 15:26:09 crc kubenswrapper[4764]: I0320 15:26:09.556719 4764 scope.go:117] "RemoveContainer" containerID="b88ffca008c07a3e30a960a27169368f3c4a70902b77ad07c9e373dfd45f48be"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.092073 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.192758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9pxk\" (UniqueName: \"kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk\") pod \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") "
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.192807 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content\") pod \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") "
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.193023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities\") pod \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\" (UID: \"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea\") "
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.194478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities" (OuterVolumeSpecName: "utilities") pod "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" (UID: "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.211703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk" (OuterVolumeSpecName: "kube-api-access-t9pxk") pod "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" (UID: "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea"). InnerVolumeSpecName "kube-api-access-t9pxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.295417 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9pxk\" (UniqueName: \"kubernetes.io/projected/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-kube-api-access-t9pxk\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.295441 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.568858 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerID="a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c" exitCode=0
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.568914 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerDied","Data":"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"}
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.568948 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgsj9"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.568968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgsj9" event={"ID":"3b5bc25e-f0f6-47d1-b64f-e2fce5381dea","Type":"ContainerDied","Data":"d09efbb646b899c751a8785ff97dd319ff335351266984a6e44bcbc9ee20ce17"}
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.568986 4764 scope.go:117] "RemoveContainer" containerID="a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.594216 4764 scope.go:117] "RemoveContainer" containerID="b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.620556 4764 scope.go:117] "RemoveContainer" containerID="794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.663989 4764 scope.go:117] "RemoveContainer" containerID="a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"
Mar 20 15:26:10 crc kubenswrapper[4764]: E0320 15:26:10.664530 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c\": container with ID starting with a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c not found: ID does not exist" containerID="a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.664579 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c"} err="failed to get container status \"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c\": rpc error: code = NotFound desc = could not find container \"a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c\": container with ID starting with a9ae872b7a01f28c04abd953e722fcb0124a5f170e50cf93317f91111df5bb9c not found: ID does not exist"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.664605 4764 scope.go:117] "RemoveContainer" containerID="b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"
Mar 20 15:26:10 crc kubenswrapper[4764]: E0320 15:26:10.665027 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171\": container with ID starting with b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171 not found: ID does not exist" containerID="b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.665066 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171"} err="failed to get container status \"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171\": rpc error: code = NotFound desc = could not find container \"b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171\": container with ID starting with b30867a09bdab12333585569d867ce3cde9b4cdaa76fc080aee1e251b8e89171 not found: ID does not exist"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.665087 4764 scope.go:117] "RemoveContainer" containerID="794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269"
Mar 20 15:26:10 crc kubenswrapper[4764]: E0320 15:26:10.665547 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269\": container with ID starting with 794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269 not found: ID does not exist" containerID="794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.665597 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269"} err="failed to get container status \"794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269\": rpc error: code = NotFound desc = could not find container \"794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269\": container with ID starting with 794df2df65f058fa470219575f02348e423332b000192d09114beb25d2c61269 not found: ID does not exist"
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.886849 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" (UID: "3b5bc25e-f0f6-47d1-b64f-e2fce5381dea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:26:10 crc kubenswrapper[4764]: I0320 15:26:10.906215 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:11 crc kubenswrapper[4764]: I0320 15:26:11.192823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgsj9"]
Mar 20 15:26:11 crc kubenswrapper[4764]: I0320 15:26:11.202242 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgsj9"]
Mar 20 15:26:13 crc kubenswrapper[4764]: I0320 15:26:13.137511 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" path="/var/lib/kubelet/pods/3b5bc25e-f0f6-47d1-b64f-e2fce5381dea/volumes"
Mar 20 15:26:41 crc kubenswrapper[4764]: I0320 15:26:41.908608 4764 generic.go:334] "Generic (PLEG): container finished" podID="28a506a3-463d-4bc4-ab93-2e8201878e60" containerID="d4ec91c715574acc5ffce7b60a5a0007f0f781b3fd3b641b3940ae6b4ea7a54c" exitCode=0
Mar 20 15:26:41 crc kubenswrapper[4764]: I0320 15:26:41.908751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" event={"ID":"28a506a3-463d-4bc4-ab93-2e8201878e60","Type":"ContainerDied","Data":"d4ec91c715574acc5ffce7b60a5a0007f0f781b3fd3b641b3940ae6b4ea7a54c"}
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.356942 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk"
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.464317 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0\") pod \"28a506a3-463d-4bc4-ab93-2e8201878e60\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") "
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.464517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8hmr\" (UniqueName: \"kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr\") pod \"28a506a3-463d-4bc4-ab93-2e8201878e60\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") "
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.464599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam\") pod \"28a506a3-463d-4bc4-ab93-2e8201878e60\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") "
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.464639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle\") pod \"28a506a3-463d-4bc4-ab93-2e8201878e60\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") "
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.464689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory\") pod \"28a506a3-463d-4bc4-ab93-2e8201878e60\" (UID: \"28a506a3-463d-4bc4-ab93-2e8201878e60\") "
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.470645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr" (OuterVolumeSpecName: "kube-api-access-v8hmr") pod "28a506a3-463d-4bc4-ab93-2e8201878e60" (UID: "28a506a3-463d-4bc4-ab93-2e8201878e60"). InnerVolumeSpecName "kube-api-access-v8hmr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.471442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "28a506a3-463d-4bc4-ab93-2e8201878e60" (UID: "28a506a3-463d-4bc4-ab93-2e8201878e60"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.497209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "28a506a3-463d-4bc4-ab93-2e8201878e60" (UID: "28a506a3-463d-4bc4-ab93-2e8201878e60"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.502232 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28a506a3-463d-4bc4-ab93-2e8201878e60" (UID: "28a506a3-463d-4bc4-ab93-2e8201878e60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.519406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory" (OuterVolumeSpecName: "inventory") pod "28a506a3-463d-4bc4-ab93-2e8201878e60" (UID: "28a506a3-463d-4bc4-ab93-2e8201878e60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.567524 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8hmr\" (UniqueName: \"kubernetes.io/projected/28a506a3-463d-4bc4-ab93-2e8201878e60-kube-api-access-v8hmr\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.577671 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.577693 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.577709 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28a506a3-463d-4bc4-ab93-2e8201878e60-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.577723 4764 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/28a506a3-463d-4bc4-ab93-2e8201878e60-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.931563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk" event={"ID":"28a506a3-463d-4bc4-ab93-2e8201878e60","Type":"ContainerDied","Data":"1feaa14a733dfb74b7b59b38c47d314111dc06dbeffd7ba96563f772d75dfd14"}
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.931633 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1feaa14a733dfb74b7b59b38c47d314111dc06dbeffd7ba96563f772d75dfd14"
Mar 20 15:26:43 crc kubenswrapper[4764]: I0320 15:26:43.931644 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v2tgk"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026150 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"]
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026509 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a506a3-463d-4bc4-ab93-2e8201878e60" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026527 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a506a3-463d-4bc4-ab93-2e8201878e60" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026546 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026553 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026562 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="extract-content"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026569 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="extract-content"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026582 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="extract-content"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026587 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="extract-content"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026597 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="extract-utilities"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026602 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="extract-utilities"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026610 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="extract-utilities"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026617 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="extract-utilities"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026633 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026639 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: E0320 15:26:44.026649 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" containerName="oc"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026654 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" containerName="oc"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026817 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a506a3-463d-4bc4-ab93-2e8201878e60" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026840 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" containerName="oc"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026853 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef036762-4081-4a2c-a8f7-249b11310308" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.026872 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5bc25e-f0f6-47d1-b64f-e2fce5381dea" containerName="registry-server"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.027478 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.029670 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.031351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.031397 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.031644 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.031696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.031761 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.042668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"]
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhxz\" (UniqueName: \"kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.090756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.192792 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.193488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.193594 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhxz\" (UniqueName: \"kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.193618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.193719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.193777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.196623 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.197771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.197854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.198826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.212550 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhxz\" (UniqueName: \"kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.215405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.363283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.957870 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt"]
Mar 20 15:26:44 crc kubenswrapper[4764]: I0320 15:26:44.981965 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:26:45 crc kubenswrapper[4764]: I0320 15:26:45.951082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" event={"ID":"c81117b3-6d55-444b-bed2-9b7eac23bf8e","Type":"ContainerStarted","Data":"3a771e67f4bc6ade8212dd8d6a372528e0b139fa5749afb7f9168869487e077d"}
Mar 20 15:26:45 crc kubenswrapper[4764]: I0320 15:26:45.951659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" event={"ID":"c81117b3-6d55-444b-bed2-9b7eac23bf8e","Type":"ContainerStarted","Data":"475a094bf0775d0f751cc77150efb4defc5ea8e4e30800931e308770745b1f78"}
Mar 20 15:26:45 crc kubenswrapper[4764]: I0320 15:26:45.979792 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" podStartSLOduration=1.331232066 podStartE2EDuration="1.979763563s" podCreationTimestamp="2026-03-20 15:26:44 +0000 UTC" firstStartedPulling="2026-03-20 15:26:44.981772981 +0000 UTC m=+2126.597962110" lastFinishedPulling="2026-03-20 15:26:45.630304478 +0000 UTC m=+2127.246493607" observedRunningTime="2026-03-20 15:26:45.977164324 +0000 UTC m=+2127.593353463" watchObservedRunningTime="2026-03-20 15:26:45.979763563 +0000 UTC m=+2127.595952732"
Mar 20 
15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.194278 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.198534 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.215858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.329676 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd4nt\" (UniqueName: \"kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.329757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.329842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.431362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4nt\" (UniqueName: 
\"kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.431477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.431556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.432144 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.432905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.462531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4nt\" (UniqueName: 
\"kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt\") pod \"redhat-operators-7kkvk\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:58 crc kubenswrapper[4764]: I0320 15:26:58.532447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:26:59 crc kubenswrapper[4764]: I0320 15:26:59.083086 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:26:59 crc kubenswrapper[4764]: W0320 15:26:59.089376 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e39dea_d514_49af_9db0_29a5036609c4.slice/crio-df6e12d400ee2e6a51d07dadeb19a08fd54aecbc7f19e93f110cf6167e186ebf WatchSource:0}: Error finding container df6e12d400ee2e6a51d07dadeb19a08fd54aecbc7f19e93f110cf6167e186ebf: Status 404 returned error can't find the container with id df6e12d400ee2e6a51d07dadeb19a08fd54aecbc7f19e93f110cf6167e186ebf Mar 20 15:26:59 crc kubenswrapper[4764]: I0320 15:26:59.300882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerStarted","Data":"df6e12d400ee2e6a51d07dadeb19a08fd54aecbc7f19e93f110cf6167e186ebf"} Mar 20 15:27:00 crc kubenswrapper[4764]: I0320 15:27:00.321034 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9e39dea-d514-49af-9db0-29a5036609c4" containerID="6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38" exitCode=0 Mar 20 15:27:00 crc kubenswrapper[4764]: I0320 15:27:00.321140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" 
event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerDied","Data":"6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38"} Mar 20 15:27:02 crc kubenswrapper[4764]: I0320 15:27:02.348230 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9e39dea-d514-49af-9db0-29a5036609c4" containerID="4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a" exitCode=0 Mar 20 15:27:02 crc kubenswrapper[4764]: I0320 15:27:02.348321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerDied","Data":"4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a"} Mar 20 15:27:03 crc kubenswrapper[4764]: I0320 15:27:03.411987 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7kkvk" podStartSLOduration=2.615885075 podStartE2EDuration="5.41196089s" podCreationTimestamp="2026-03-20 15:26:58 +0000 UTC" firstStartedPulling="2026-03-20 15:27:00.325531704 +0000 UTC m=+2141.941720833" lastFinishedPulling="2026-03-20 15:27:03.121607489 +0000 UTC m=+2144.737796648" observedRunningTime="2026-03-20 15:27:03.399652815 +0000 UTC m=+2145.015841974" watchObservedRunningTime="2026-03-20 15:27:03.41196089 +0000 UTC m=+2145.028150049" Mar 20 15:27:04 crc kubenswrapper[4764]: I0320 15:27:04.385834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerStarted","Data":"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f"} Mar 20 15:27:08 crc kubenswrapper[4764]: I0320 15:27:08.533137 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:08 crc kubenswrapper[4764]: I0320 15:27:08.534584 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:09 crc kubenswrapper[4764]: I0320 15:27:09.579638 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kkvk" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="registry-server" probeResult="failure" output=< Mar 20 15:27:09 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:27:09 crc kubenswrapper[4764]: > Mar 20 15:27:18 crc kubenswrapper[4764]: I0320 15:27:18.587660 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:18 crc kubenswrapper[4764]: I0320 15:27:18.665994 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:18 crc kubenswrapper[4764]: I0320 15:27:18.827448 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:27:20 crc kubenswrapper[4764]: I0320 15:27:20.535896 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7kkvk" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="registry-server" containerID="cri-o://c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f" gracePeriod=2 Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.519836 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.546103 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9e39dea-d514-49af-9db0-29a5036609c4" containerID="c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f" exitCode=0 Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.546145 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7kkvk" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.546154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerDied","Data":"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f"} Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.546192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kkvk" event={"ID":"a9e39dea-d514-49af-9db0-29a5036609c4","Type":"ContainerDied","Data":"df6e12d400ee2e6a51d07dadeb19a08fd54aecbc7f19e93f110cf6167e186ebf"} Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.546215 4764 scope.go:117] "RemoveContainer" containerID="c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.573610 4764 scope.go:117] "RemoveContainer" containerID="4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.596748 4764 scope.go:117] "RemoveContainer" containerID="6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.622176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities\") pod \"a9e39dea-d514-49af-9db0-29a5036609c4\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.622954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities" (OuterVolumeSpecName: "utilities") pod "a9e39dea-d514-49af-9db0-29a5036609c4" (UID: "a9e39dea-d514-49af-9db0-29a5036609c4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.623164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd4nt\" (UniqueName: \"kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt\") pod \"a9e39dea-d514-49af-9db0-29a5036609c4\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.623242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content\") pod \"a9e39dea-d514-49af-9db0-29a5036609c4\" (UID: \"a9e39dea-d514-49af-9db0-29a5036609c4\") " Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.626316 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.629698 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt" (OuterVolumeSpecName: "kube-api-access-kd4nt") pod "a9e39dea-d514-49af-9db0-29a5036609c4" (UID: "a9e39dea-d514-49af-9db0-29a5036609c4"). InnerVolumeSpecName "kube-api-access-kd4nt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.684539 4764 scope.go:117] "RemoveContainer" containerID="c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f" Mar 20 15:27:21 crc kubenswrapper[4764]: E0320 15:27:21.685694 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f\": container with ID starting with c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f not found: ID does not exist" containerID="c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.685756 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f"} err="failed to get container status \"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f\": rpc error: code = NotFound desc = could not find container \"c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f\": container with ID starting with c50396d1b609550254982bef44f7206194e6188ac79814c42468f0b4c2a8378f not found: ID does not exist" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.685790 4764 scope.go:117] "RemoveContainer" containerID="4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a" Mar 20 15:27:21 crc kubenswrapper[4764]: E0320 15:27:21.686263 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a\": container with ID starting with 4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a not found: ID does not exist" containerID="4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.686510 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a"} err="failed to get container status \"4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a\": rpc error: code = NotFound desc = could not find container \"4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a\": container with ID starting with 4beaf22b6eaf900286b1948e2762601debdf2ccd1d74fe46fb52485450f59d0a not found: ID does not exist" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.686739 4764 scope.go:117] "RemoveContainer" containerID="6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38" Mar 20 15:27:21 crc kubenswrapper[4764]: E0320 15:27:21.687416 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38\": container with ID starting with 6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38 not found: ID does not exist" containerID="6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.687500 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38"} err="failed to get container status \"6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38\": rpc error: code = NotFound desc = could not find container \"6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38\": container with ID starting with 6aaae630ea24dfd4416e358096c494ad7b9f2f28431255db7ab27a4e169f6a38 not found: ID does not exist" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.728852 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd4nt\" (UniqueName: 
\"kubernetes.io/projected/a9e39dea-d514-49af-9db0-29a5036609c4-kube-api-access-kd4nt\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.738212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9e39dea-d514-49af-9db0-29a5036609c4" (UID: "a9e39dea-d514-49af-9db0-29a5036609c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.831281 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e39dea-d514-49af-9db0-29a5036609c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.926909 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:27:21 crc kubenswrapper[4764]: I0320 15:27:21.942975 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7kkvk"] Mar 20 15:27:23 crc kubenswrapper[4764]: I0320 15:27:23.135836 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" path="/var/lib/kubelet/pods/a9e39dea-d514-49af-9db0-29a5036609c4/volumes" Mar 20 15:27:35 crc kubenswrapper[4764]: I0320 15:27:35.686816 4764 generic.go:334] "Generic (PLEG): container finished" podID="c81117b3-6d55-444b-bed2-9b7eac23bf8e" containerID="3a771e67f4bc6ade8212dd8d6a372528e0b139fa5749afb7f9168869487e077d" exitCode=0 Mar 20 15:27:35 crc kubenswrapper[4764]: I0320 15:27:35.686944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" 
event={"ID":"c81117b3-6d55-444b-bed2-9b7eac23bf8e","Type":"ContainerDied","Data":"3a771e67f4bc6ade8212dd8d6a372528e0b139fa5749afb7f9168869487e077d"} Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.122813 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235214 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235457 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.235492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhxz\" (UniqueName: \"kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz\") pod \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\" (UID: \"c81117b3-6d55-444b-bed2-9b7eac23bf8e\") " Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.241096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.244502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz" (OuterVolumeSpecName: "kube-api-access-vwhxz") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). InnerVolumeSpecName "kube-api-access-vwhxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.264885 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.265643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.272781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.300282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory" (OuterVolumeSpecName: "inventory") pod "c81117b3-6d55-444b-bed2-9b7eac23bf8e" (UID: "c81117b3-6d55-444b-bed2-9b7eac23bf8e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338070 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338114 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338129 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338147 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338162 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c81117b3-6d55-444b-bed2-9b7eac23bf8e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.338178 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhxz\" (UniqueName: \"kubernetes.io/projected/c81117b3-6d55-444b-bed2-9b7eac23bf8e-kube-api-access-vwhxz\") on node \"crc\" DevicePath \"\"" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.708662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" event={"ID":"c81117b3-6d55-444b-bed2-9b7eac23bf8e","Type":"ContainerDied","Data":"475a094bf0775d0f751cc77150efb4defc5ea8e4e30800931e308770745b1f78"} Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.708701 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475a094bf0775d0f751cc77150efb4defc5ea8e4e30800931e308770745b1f78" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.708889 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.783645 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz"] Mar 20 15:27:37 crc kubenswrapper[4764]: E0320 15:27:37.784007 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="extract-content" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784021 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="extract-content" Mar 20 15:27:37 crc kubenswrapper[4764]: E0320 15:27:37.784051 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="registry-server" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784057 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="registry-server" Mar 20 15:27:37 crc kubenswrapper[4764]: E0320 15:27:37.784073 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81117b3-6d55-444b-bed2-9b7eac23bf8e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784082 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c81117b3-6d55-444b-bed2-9b7eac23bf8e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 15:27:37 crc kubenswrapper[4764]: E0320 15:27:37.784093 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="extract-utilities" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784105 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="extract-utilities" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784267 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e39dea-d514-49af-9db0-29a5036609c4" containerName="registry-server" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784283 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81117b3-6d55-444b-bed2-9b7eac23bf8e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.784858 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.786646 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.790761 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.790825 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.791583 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.791841 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.794616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz"] Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.948198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8nl\" (UniqueName: \"kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.948292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: 
\"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.948376 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.948441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:37 crc kubenswrapper[4764]: I0320 15:27:37.948596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.050553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 
15:27:38.050625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8nl\" (UniqueName: \"kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.050669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.050699 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.050727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.055021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.055107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.055835 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.056161 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.068562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8nl\" (UniqueName: \"kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.100271 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.635600 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz"] Mar 20 15:27:38 crc kubenswrapper[4764]: I0320 15:27:38.725417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" event={"ID":"f80dffa9-1ef1-4046-854e-66dcbdff3c9c","Type":"ContainerStarted","Data":"f3d4b3266fc01758df161d114beaa7fdaf5aceba1d82af5f589bfa15850fa0fb"} Mar 20 15:27:39 crc kubenswrapper[4764]: I0320 15:27:39.735362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" event={"ID":"f80dffa9-1ef1-4046-854e-66dcbdff3c9c","Type":"ContainerStarted","Data":"34659f024d7b84b21f05bf07fa3ad667d218d256d97f5add8b22583c5bb68bf1"} Mar 20 15:27:39 crc kubenswrapper[4764]: I0320 15:27:39.769776 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" podStartSLOduration=2.054613667 podStartE2EDuration="2.769748074s" podCreationTimestamp="2026-03-20 15:27:37 +0000 UTC" firstStartedPulling="2026-03-20 15:27:38.647669701 +0000 UTC m=+2180.263858830" lastFinishedPulling="2026-03-20 15:27:39.362804108 +0000 UTC m=+2180.978993237" observedRunningTime="2026-03-20 15:27:39.755724794 +0000 UTC m=+2181.371913953" watchObservedRunningTime="2026-03-20 15:27:39.769748074 +0000 UTC m=+2181.385937233" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.155426 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567008-7htpr"] Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.157790 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.160899 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.161235 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.183513 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.204350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-7htpr"] Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.308309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gm7s\" (UniqueName: \"kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s\") pod \"auto-csr-approver-29567008-7htpr\" (UID: \"03aa4307-ae38-4cb6-a30e-e42af94e2341\") " pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.409823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gm7s\" (UniqueName: \"kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s\") pod \"auto-csr-approver-29567008-7htpr\" (UID: \"03aa4307-ae38-4cb6-a30e-e42af94e2341\") " pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.429008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gm7s\" (UniqueName: \"kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s\") pod \"auto-csr-approver-29567008-7htpr\" (UID: \"03aa4307-ae38-4cb6-a30e-e42af94e2341\") " 
pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.520456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:00 crc kubenswrapper[4764]: W0320 15:28:00.978198 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03aa4307_ae38_4cb6_a30e_e42af94e2341.slice/crio-e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759 WatchSource:0}: Error finding container e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759: Status 404 returned error can't find the container with id e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759 Mar 20 15:28:00 crc kubenswrapper[4764]: I0320 15:28:00.980074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-7htpr"] Mar 20 15:28:01 crc kubenswrapper[4764]: I0320 15:28:01.933672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-7htpr" event={"ID":"03aa4307-ae38-4cb6-a30e-e42af94e2341","Type":"ContainerStarted","Data":"e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759"} Mar 20 15:28:04 crc kubenswrapper[4764]: I0320 15:28:04.964527 4764 generic.go:334] "Generic (PLEG): container finished" podID="03aa4307-ae38-4cb6-a30e-e42af94e2341" containerID="2a1293a0326344552b0456126d7c8e239fd2c61cc8c89171680628ab2d365db7" exitCode=0 Mar 20 15:28:04 crc kubenswrapper[4764]: I0320 15:28:04.965078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-7htpr" event={"ID":"03aa4307-ae38-4cb6-a30e-e42af94e2341","Type":"ContainerDied","Data":"2a1293a0326344552b0456126d7c8e239fd2c61cc8c89171680628ab2d365db7"} Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.337431 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.530103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gm7s\" (UniqueName: \"kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s\") pod \"03aa4307-ae38-4cb6-a30e-e42af94e2341\" (UID: \"03aa4307-ae38-4cb6-a30e-e42af94e2341\") " Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.536598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s" (OuterVolumeSpecName: "kube-api-access-9gm7s") pod "03aa4307-ae38-4cb6-a30e-e42af94e2341" (UID: "03aa4307-ae38-4cb6-a30e-e42af94e2341"). InnerVolumeSpecName "kube-api-access-9gm7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.631974 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gm7s\" (UniqueName: \"kubernetes.io/projected/03aa4307-ae38-4cb6-a30e-e42af94e2341-kube-api-access-9gm7s\") on node \"crc\" DevicePath \"\"" Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.985885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567008-7htpr" event={"ID":"03aa4307-ae38-4cb6-a30e-e42af94e2341","Type":"ContainerDied","Data":"e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759"} Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.985939 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f70e6a098ff044fce7c486eda5054e53f8f61e438d7d84bf4c5e3ca3169759" Mar 20 15:28:06 crc kubenswrapper[4764]: I0320 15:28:06.985980 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567008-7htpr" Mar 20 15:28:07 crc kubenswrapper[4764]: I0320 15:28:07.418508 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567002-nvm8f"] Mar 20 15:28:07 crc kubenswrapper[4764]: I0320 15:28:07.428738 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567002-nvm8f"] Mar 20 15:28:08 crc kubenswrapper[4764]: I0320 15:28:08.443808 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:28:08 crc kubenswrapper[4764]: I0320 15:28:08.443873 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:28:09 crc kubenswrapper[4764]: I0320 15:28:09.138582 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f06100-41b2-4f66-8e59-3ccb3cd7c38b" path="/var/lib/kubelet/pods/46f06100-41b2-4f66-8e59-3ccb3cd7c38b/volumes" Mar 20 15:28:09 crc kubenswrapper[4764]: I0320 15:28:09.459431 4764 scope.go:117] "RemoveContainer" containerID="3e3d42a74df62b1b836aacb23569107418c862474e9e8d6d8a31123f17e0de5a" Mar 20 15:28:38 crc kubenswrapper[4764]: I0320 15:28:38.443720 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:28:38 crc kubenswrapper[4764]: 
I0320 15:28:38.446785 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.443098 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.443810 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.443869 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.445021 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.445104 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" 
containerName="machine-config-daemon" containerID="cri-o://30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" gracePeriod=600 Mar 20 15:29:08 crc kubenswrapper[4764]: E0320 15:29:08.637921 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.677677 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" exitCode=0 Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.677740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"} Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.677983 4764 scope.go:117] "RemoveContainer" containerID="682d34616f4f512a04218ac49f94a47dab3b53e7ea4a05247eddd63a6d004a5c" Mar 20 15:29:08 crc kubenswrapper[4764]: I0320 15:29:08.678519 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:29:08 crc kubenswrapper[4764]: E0320 15:29:08.679037 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:29:23 crc kubenswrapper[4764]: I0320 15:29:23.126816 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:29:23 crc kubenswrapper[4764]: E0320 15:29:23.127923 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:29:36 crc kubenswrapper[4764]: I0320 15:29:36.126044 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:29:36 crc kubenswrapper[4764]: E0320 15:29:36.126903 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:29:50 crc kubenswrapper[4764]: I0320 15:29:50.125917 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:29:50 crc kubenswrapper[4764]: E0320 15:29:50.126851 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.157725 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567010-x2ddf"] Mar 20 15:30:00 crc kubenswrapper[4764]: E0320 15:30:00.159785 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03aa4307-ae38-4cb6-a30e-e42af94e2341" containerName="oc" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.159809 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="03aa4307-ae38-4cb6-a30e-e42af94e2341" containerName="oc" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.160031 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="03aa4307-ae38-4cb6-a30e-e42af94e2341" containerName="oc" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.160870 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.169435 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.169984 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.170232 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.172182 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-x2ddf"] Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.189092 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l"] Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.191011 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.193720 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.194150 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.233688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l"] Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.301353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt455\" (UniqueName: \"kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.301500 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.301554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.301596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pnk\" (UniqueName: \"kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk\") pod \"auto-csr-approver-29567010-x2ddf\" (UID: \"94b192c7-fa74-49a4-903b-ec0fa1a86ddd\") " pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.403006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt455\" (UniqueName: \"kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.403067 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.403109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.403153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pnk\" (UniqueName: 
\"kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk\") pod \"auto-csr-approver-29567010-x2ddf\" (UID: \"94b192c7-fa74-49a4-903b-ec0fa1a86ddd\") " pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.404269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.409497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.421887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt455\" (UniqueName: \"kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455\") pod \"collect-profiles-29567010-9xj9l\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.422572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pnk\" (UniqueName: \"kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk\") pod \"auto-csr-approver-29567010-x2ddf\" (UID: \"94b192c7-fa74-49a4-903b-ec0fa1a86ddd\") " pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.483765 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.528255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:00 crc kubenswrapper[4764]: I0320 15:30:00.971055 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-x2ddf"] Mar 20 15:30:01 crc kubenswrapper[4764]: I0320 15:30:01.043357 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l"] Mar 20 15:30:01 crc kubenswrapper[4764]: W0320 15:30:01.046069 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf3adadc_fc1f_4af2_9ccd_a0c064a0dc83.slice/crio-91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d WatchSource:0}: Error finding container 91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d: Status 404 returned error can't find the container with id 91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d Mar 20 15:30:01 crc kubenswrapper[4764]: I0320 15:30:01.240369 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" event={"ID":"94b192c7-fa74-49a4-903b-ec0fa1a86ddd","Type":"ContainerStarted","Data":"700a66ae70f6166b6034341e64105d2e1a8f54ce377296ffb25a62461e5abc3a"} Mar 20 15:30:01 crc kubenswrapper[4764]: I0320 15:30:01.241658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" event={"ID":"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83","Type":"ContainerStarted","Data":"91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d"} Mar 20 15:30:02 crc kubenswrapper[4764]: I0320 15:30:02.251740 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" containerID="31cd0af3a7d5a7e77a18397b121096a62b6d8e69c95ad0f47567627e8030f8f1" exitCode=0 Mar 20 15:30:02 crc kubenswrapper[4764]: I0320 15:30:02.251818 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" event={"ID":"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83","Type":"ContainerDied","Data":"31cd0af3a7d5a7e77a18397b121096a62b6d8e69c95ad0f47567627e8030f8f1"} Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.264423 4764 generic.go:334] "Generic (PLEG): container finished" podID="94b192c7-fa74-49a4-903b-ec0fa1a86ddd" containerID="86aec7e13e182c340a9d434b3877ae43a85e119549f65b24aae0ebafb5e50d7f" exitCode=0 Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.264512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" event={"ID":"94b192c7-fa74-49a4-903b-ec0fa1a86ddd","Type":"ContainerDied","Data":"86aec7e13e182c340a9d434b3877ae43a85e119549f65b24aae0ebafb5e50d7f"} Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.614508 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.696128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume\") pod \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.696276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume\") pod \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.696346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt455\" (UniqueName: \"kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455\") pod \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\" (UID: \"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83\") " Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.697943 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume" (OuterVolumeSpecName: "config-volume") pod "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" (UID: "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.706043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" (UID: "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.714109 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455" (OuterVolumeSpecName: "kube-api-access-lt455") pod "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" (UID: "cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83"). InnerVolumeSpecName "kube-api-access-lt455". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.799186 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.799241 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt455\" (UniqueName: \"kubernetes.io/projected/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-kube-api-access-lt455\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:03 crc kubenswrapper[4764]: I0320 15:30:03.799254 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:04 crc kubenswrapper[4764]: I0320 15:30:04.127074 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:30:04 crc kubenswrapper[4764]: E0320 15:30:04.127685 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" 
Mar 20 15:30:04 crc kubenswrapper[4764]: I0320 15:30:04.275743 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" Mar 20 15:30:04 crc kubenswrapper[4764]: I0320 15:30:04.282528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567010-9xj9l" event={"ID":"cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83","Type":"ContainerDied","Data":"91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d"} Mar 20 15:30:04 crc kubenswrapper[4764]: I0320 15:30:04.282621 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ba8e87c60613168d907ee1ce5225d392428e1470f3f3a640aba21827ddd04d" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.660965 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.703840 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"] Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.711208 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5zjk4"] Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.729766 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2pnk\" (UniqueName: \"kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk\") pod \"94b192c7-fa74-49a4-903b-ec0fa1a86ddd\" (UID: \"94b192c7-fa74-49a4-903b-ec0fa1a86ddd\") " Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.734221 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk" (OuterVolumeSpecName: 
"kube-api-access-x2pnk") pod "94b192c7-fa74-49a4-903b-ec0fa1a86ddd" (UID: "94b192c7-fa74-49a4-903b-ec0fa1a86ddd"). InnerVolumeSpecName "kube-api-access-x2pnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:04.832438 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2pnk\" (UniqueName: \"kubernetes.io/projected/94b192c7-fa74-49a4-903b-ec0fa1a86ddd-kube-api-access-x2pnk\") on node \"crc\" DevicePath \"\"" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.140041 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a20779e-1d3a-4c81-86c7-3248b50c8118" path="/var/lib/kubelet/pods/1a20779e-1d3a-4c81-86c7-3248b50c8118/volumes" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.284533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" event={"ID":"94b192c7-fa74-49a4-903b-ec0fa1a86ddd","Type":"ContainerDied","Data":"700a66ae70f6166b6034341e64105d2e1a8f54ce377296ffb25a62461e5abc3a"} Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.284570 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700a66ae70f6166b6034341e64105d2e1a8f54ce377296ffb25a62461e5abc3a" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.284625 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567010-x2ddf" Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.734222 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567004-26g8t"] Mar 20 15:30:05 crc kubenswrapper[4764]: I0320 15:30:05.742955 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567004-26g8t"] Mar 20 15:30:07 crc kubenswrapper[4764]: I0320 15:30:07.137239 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7fc759-c1c6-46ec-ab74-fb9f2546f661" path="/var/lib/kubelet/pods/5b7fc759-c1c6-46ec-ab74-fb9f2546f661/volumes" Mar 20 15:30:09 crc kubenswrapper[4764]: I0320 15:30:09.608187 4764 scope.go:117] "RemoveContainer" containerID="e05b03a6e9dfc5273b2d8b8f7795b765fe8ecc2344eed5840ae227d5faea20ed" Mar 20 15:30:09 crc kubenswrapper[4764]: I0320 15:30:09.640711 4764 scope.go:117] "RemoveContainer" containerID="5b9c2245ce56a346a03bb7d2c359fcf5d89f12ee63b70f87c29dcb9a1fd9e7f8" Mar 20 15:30:18 crc kubenswrapper[4764]: I0320 15:30:18.128097 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:30:18 crc kubenswrapper[4764]: E0320 15:30:18.129095 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:30:33 crc kubenswrapper[4764]: I0320 15:30:33.127498 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:30:33 crc kubenswrapper[4764]: E0320 15:30:33.128280 4764 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:30:45 crc kubenswrapper[4764]: I0320 15:30:45.126330 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:30:45 crc kubenswrapper[4764]: E0320 15:30:45.127446 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:30:56 crc kubenswrapper[4764]: I0320 15:30:56.127672 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:30:56 crc kubenswrapper[4764]: E0320 15:30:56.128510 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:07 crc kubenswrapper[4764]: I0320 15:31:07.126646 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:31:07 crc kubenswrapper[4764]: E0320 15:31:07.128730 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:19 crc kubenswrapper[4764]: I0320 15:31:19.134461 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:31:19 crc kubenswrapper[4764]: E0320 15:31:19.135255 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:30 crc kubenswrapper[4764]: I0320 15:31:30.127703 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:31:30 crc kubenswrapper[4764]: E0320 15:31:30.129219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:34 crc kubenswrapper[4764]: I0320 15:31:34.144899 4764 generic.go:334] "Generic (PLEG): container finished" podID="f80dffa9-1ef1-4046-854e-66dcbdff3c9c" containerID="34659f024d7b84b21f05bf07fa3ad667d218d256d97f5add8b22583c5bb68bf1" 
exitCode=0 Mar 20 15:31:34 crc kubenswrapper[4764]: I0320 15:31:34.144983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" event={"ID":"f80dffa9-1ef1-4046-854e-66dcbdff3c9c","Type":"ContainerDied","Data":"34659f024d7b84b21f05bf07fa3ad667d218d256d97f5add8b22583c5bb68bf1"} Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.640525 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.794042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam\") pod \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.794156 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0\") pod \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.794220 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch8nl\" (UniqueName: \"kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl\") pod \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.794244 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory\") pod \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " Mar 
20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.794288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle\") pod \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\" (UID: \"f80dffa9-1ef1-4046-854e-66dcbdff3c9c\") " Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.800781 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl" (OuterVolumeSpecName: "kube-api-access-ch8nl") pod "f80dffa9-1ef1-4046-854e-66dcbdff3c9c" (UID: "f80dffa9-1ef1-4046-854e-66dcbdff3c9c"). InnerVolumeSpecName "kube-api-access-ch8nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.815621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f80dffa9-1ef1-4046-854e-66dcbdff3c9c" (UID: "f80dffa9-1ef1-4046-854e-66dcbdff3c9c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.819747 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory" (OuterVolumeSpecName: "inventory") pod "f80dffa9-1ef1-4046-854e-66dcbdff3c9c" (UID: "f80dffa9-1ef1-4046-854e-66dcbdff3c9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.821575 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f80dffa9-1ef1-4046-854e-66dcbdff3c9c" (UID: "f80dffa9-1ef1-4046-854e-66dcbdff3c9c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.822861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f80dffa9-1ef1-4046-854e-66dcbdff3c9c" (UID: "f80dffa9-1ef1-4046-854e-66dcbdff3c9c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.896126 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch8nl\" (UniqueName: \"kubernetes.io/projected/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-kube-api-access-ch8nl\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.896159 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.896171 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.896181 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:35 crc kubenswrapper[4764]: I0320 15:31:35.896193 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f80dffa9-1ef1-4046-854e-66dcbdff3c9c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.166516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" event={"ID":"f80dffa9-1ef1-4046-854e-66dcbdff3c9c","Type":"ContainerDied","Data":"f3d4b3266fc01758df161d114beaa7fdaf5aceba1d82af5f589bfa15850fa0fb"} Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.166548 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d4b3266fc01758df161d114beaa7fdaf5aceba1d82af5f589bfa15850fa0fb" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.166593 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.264331 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2"] Mar 20 15:31:36 crc kubenswrapper[4764]: E0320 15:31:36.264796 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80dffa9-1ef1-4046-854e-66dcbdff3c9c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.264819 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80dffa9-1ef1-4046-854e-66dcbdff3c9c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 15:31:36 crc kubenswrapper[4764]: E0320 15:31:36.264832 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" containerName="collect-profiles" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.264840 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" containerName="collect-profiles" Mar 20 15:31:36 crc kubenswrapper[4764]: E0320 15:31:36.264870 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b192c7-fa74-49a4-903b-ec0fa1a86ddd" containerName="oc" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.264878 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b192c7-fa74-49a4-903b-ec0fa1a86ddd" containerName="oc" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.265076 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80dffa9-1ef1-4046-854e-66dcbdff3c9c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.265108 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b192c7-fa74-49a4-903b-ec0fa1a86ddd" containerName="oc" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.265121 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3adadc-fc1f-4af2-9ccd-a0c064a0dc83" containerName="collect-profiles" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.266157 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.269830 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.270090 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.269842 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.270336 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.269891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.270519 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.269943 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.276890 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2"] Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.315905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.315954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.315986 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316094 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: 
\"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtk5\" (UniqueName: \"kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 
15:31:36.316409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.316458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417729 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtk5\" (UniqueName: \"kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.417999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.418014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.418046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.419368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.422918 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.422964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.423069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.423199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc 
kubenswrapper[4764]: I0320 15:31:36.424275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.424546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.424617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.425812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.434242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.437139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtk5\" (UniqueName: \"kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-pxgf2\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:36 crc kubenswrapper[4764]: I0320 15:31:36.615764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" Mar 20 15:31:37 crc kubenswrapper[4764]: I0320 15:31:37.230664 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2"] Mar 20 15:31:38 crc kubenswrapper[4764]: I0320 15:31:38.188730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" event={"ID":"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d","Type":"ContainerStarted","Data":"9a98e954acd0cd56bbc43e32d0338cf364c6bfdb51ff639486ba65e902b6cec9"} Mar 20 15:31:38 crc kubenswrapper[4764]: I0320 15:31:38.189328 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" event={"ID":"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d","Type":"ContainerStarted","Data":"0be69503e7fb981eb90a5b86973eb5290baaa9057600ac5585aa6b2ce30746d9"} Mar 20 15:31:38 crc kubenswrapper[4764]: I0320 15:31:38.216121 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" podStartSLOduration=1.7547259990000001 podStartE2EDuration="2.216099076s" podCreationTimestamp="2026-03-20 15:31:36 +0000 UTC" 
firstStartedPulling="2026-03-20 15:31:37.232465822 +0000 UTC m=+2418.848654971" lastFinishedPulling="2026-03-20 15:31:37.693838909 +0000 UTC m=+2419.310028048" observedRunningTime="2026-03-20 15:31:38.210619967 +0000 UTC m=+2419.826809136" watchObservedRunningTime="2026-03-20 15:31:38.216099076 +0000 UTC m=+2419.832288215" Mar 20 15:31:42 crc kubenswrapper[4764]: I0320 15:31:42.948701 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:42 crc kubenswrapper[4764]: I0320 15:31:42.952226 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:42 crc kubenswrapper[4764]: I0320 15:31:42.973733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.057961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.058104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.058152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7xn\" (UniqueName: \"kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn\") pod 
\"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.126908 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:31:43 crc kubenswrapper[4764]: E0320 15:31:43.127169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.159679 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.159798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7xn\" (UniqueName: \"kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.159965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" 
Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.160362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.160475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.194888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7xn\" (UniqueName: \"kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn\") pod \"certified-operators-hlk24\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.279609 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:43 crc kubenswrapper[4764]: I0320 15:31:43.798750 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:44 crc kubenswrapper[4764]: I0320 15:31:44.259582 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerID="fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e" exitCode=0 Mar 20 15:31:44 crc kubenswrapper[4764]: I0320 15:31:44.259633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerDied","Data":"fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e"} Mar 20 15:31:44 crc kubenswrapper[4764]: I0320 15:31:44.259667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerStarted","Data":"9b44c7fe261c90241d78426b3223b2a770bdc4fdfdaa857342f4c238d6c6df8b"} Mar 20 15:31:45 crc kubenswrapper[4764]: I0320 15:31:45.272489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerStarted","Data":"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4"} Mar 20 15:31:46 crc kubenswrapper[4764]: I0320 15:31:46.298160 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerID="dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4" exitCode=0 Mar 20 15:31:46 crc kubenswrapper[4764]: I0320 15:31:46.298233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" 
event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerDied","Data":"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4"} Mar 20 15:31:46 crc kubenswrapper[4764]: I0320 15:31:46.301806 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:31:47 crc kubenswrapper[4764]: I0320 15:31:47.321178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerStarted","Data":"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3"} Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.279919 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.280601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.321112 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.340361 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hlk24" podStartSLOduration=8.619857534 podStartE2EDuration="11.340345745s" podCreationTimestamp="2026-03-20 15:31:42 +0000 UTC" firstStartedPulling="2026-03-20 15:31:44.26226378 +0000 UTC m=+2425.878452909" lastFinishedPulling="2026-03-20 15:31:46.982751991 +0000 UTC m=+2428.598941120" observedRunningTime="2026-03-20 15:31:47.342666027 +0000 UTC m=+2428.958855186" watchObservedRunningTime="2026-03-20 15:31:53.340345745 +0000 UTC m=+2434.956534874" Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.436652 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:53 crc kubenswrapper[4764]: I0320 15:31:53.557181 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.392228 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hlk24" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="registry-server" containerID="cri-o://d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3" gracePeriod=2 Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.833351 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.866728 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content\") pod \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.866975 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities\") pod \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.867005 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7xn\" (UniqueName: \"kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn\") pod \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\" (UID: \"7b9cef90-3616-4b7b-9600-f2c8f83c3177\") " Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.867842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities" (OuterVolumeSpecName: "utilities") pod "7b9cef90-3616-4b7b-9600-f2c8f83c3177" (UID: "7b9cef90-3616-4b7b-9600-f2c8f83c3177"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.875141 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn" (OuterVolumeSpecName: "kube-api-access-4f7xn") pod "7b9cef90-3616-4b7b-9600-f2c8f83c3177" (UID: "7b9cef90-3616-4b7b-9600-f2c8f83c3177"). InnerVolumeSpecName "kube-api-access-4f7xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.934435 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b9cef90-3616-4b7b-9600-f2c8f83c3177" (UID: "7b9cef90-3616-4b7b-9600-f2c8f83c3177"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.968729 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.968761 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7xn\" (UniqueName: \"kubernetes.io/projected/7b9cef90-3616-4b7b-9600-f2c8f83c3177-kube-api-access-4f7xn\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:55 crc kubenswrapper[4764]: I0320 15:31:55.968773 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b9cef90-3616-4b7b-9600-f2c8f83c3177-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.126575 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:31:56 crc kubenswrapper[4764]: E0320 15:31:56.127015 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.406272 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerID="d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3" exitCode=0 Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.406312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" 
event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerDied","Data":"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3"} Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.406335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hlk24" event={"ID":"7b9cef90-3616-4b7b-9600-f2c8f83c3177","Type":"ContainerDied","Data":"9b44c7fe261c90241d78426b3223b2a770bdc4fdfdaa857342f4c238d6c6df8b"} Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.406353 4764 scope.go:117] "RemoveContainer" containerID="d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.406483 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hlk24" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.435221 4764 scope.go:117] "RemoveContainer" containerID="dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.442889 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.454706 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hlk24"] Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.467256 4764 scope.go:117] "RemoveContainer" containerID="fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.500266 4764 scope.go:117] "RemoveContainer" containerID="d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3" Mar 20 15:31:56 crc kubenswrapper[4764]: E0320 15:31:56.504498 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3\": container 
with ID starting with d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3 not found: ID does not exist" containerID="d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.504544 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3"} err="failed to get container status \"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3\": rpc error: code = NotFound desc = could not find container \"d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3\": container with ID starting with d651178f26b12a9d4a88303a967d80a277bd99d5c797cbc9605d493c97f295e3 not found: ID does not exist" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.504567 4764 scope.go:117] "RemoveContainer" containerID="dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4" Mar 20 15:31:56 crc kubenswrapper[4764]: E0320 15:31:56.505063 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4\": container with ID starting with dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4 not found: ID does not exist" containerID="dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.505087 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4"} err="failed to get container status \"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4\": rpc error: code = NotFound desc = could not find container \"dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4\": container with ID starting with dcacb61204a78714bbd543b604b4a0c250929915abd45202d098878ab79e21d4 not 
found: ID does not exist" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.505107 4764 scope.go:117] "RemoveContainer" containerID="fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e" Mar 20 15:31:56 crc kubenswrapper[4764]: E0320 15:31:56.508488 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e\": container with ID starting with fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e not found: ID does not exist" containerID="fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e" Mar 20 15:31:56 crc kubenswrapper[4764]: I0320 15:31:56.508531 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e"} err="failed to get container status \"fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e\": rpc error: code = NotFound desc = could not find container \"fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e\": container with ID starting with fd124d99d31ba145c15977ab619a740ea27853f5c96bb475890f91d3fb91f87e not found: ID does not exist" Mar 20 15:31:57 crc kubenswrapper[4764]: I0320 15:31:57.140804 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" path="/var/lib/kubelet/pods/7b9cef90-3616-4b7b-9600-f2c8f83c3177/volumes" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.164083 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567012-zn55k"] Mar 20 15:32:00 crc kubenswrapper[4764]: E0320 15:32:00.165993 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="registry-server" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.166080 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="registry-server" Mar 20 15:32:00 crc kubenswrapper[4764]: E0320 15:32:00.166144 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="extract-utilities" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.166200 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="extract-utilities" Mar 20 15:32:00 crc kubenswrapper[4764]: E0320 15:32:00.166261 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="extract-content" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.166306 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="extract-content" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.166532 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9cef90-3616-4b7b-9600-f2c8f83c3177" containerName="registry-server" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.167269 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-zn55k" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.174653 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.174718 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.174814 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.179858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-zn55k"] Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.247346 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49ls\" (UniqueName: \"kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls\") pod \"auto-csr-approver-29567012-zn55k\" (UID: \"66303793-29d7-4930-a08d-2d21f4d9c1c2\") " pod="openshift-infra/auto-csr-approver-29567012-zn55k" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.348841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49ls\" (UniqueName: \"kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls\") pod \"auto-csr-approver-29567012-zn55k\" (UID: \"66303793-29d7-4930-a08d-2d21f4d9c1c2\") " pod="openshift-infra/auto-csr-approver-29567012-zn55k" Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.378218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49ls\" (UniqueName: \"kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls\") pod \"auto-csr-approver-29567012-zn55k\" (UID: \"66303793-29d7-4930-a08d-2d21f4d9c1c2\") " 
pod="openshift-infra/auto-csr-approver-29567012-zn55k"
Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.495242 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-zn55k"
Mar 20 15:32:00 crc kubenswrapper[4764]: I0320 15:32:00.927893 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-zn55k"]
Mar 20 15:32:01 crc kubenswrapper[4764]: I0320 15:32:01.454522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-zn55k" event={"ID":"66303793-29d7-4930-a08d-2d21f4d9c1c2","Type":"ContainerStarted","Data":"21226d36385cb46747446cbd6d15ec67ae1fa40bce09aa91c67fbf584ffaae71"}
Mar 20 15:32:02 crc kubenswrapper[4764]: I0320 15:32:02.480629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-zn55k" event={"ID":"66303793-29d7-4930-a08d-2d21f4d9c1c2","Type":"ContainerStarted","Data":"21035a430578cd5ed62c2422fdcc5f186cac0ca1bb9f222275cbb681278ac348"}
Mar 20 15:32:02 crc kubenswrapper[4764]: I0320 15:32:02.502360 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567012-zn55k" podStartSLOduration=1.348032789 podStartE2EDuration="2.50234097s" podCreationTimestamp="2026-03-20 15:32:00 +0000 UTC" firstStartedPulling="2026-03-20 15:32:00.942432273 +0000 UTC m=+2442.558621412" lastFinishedPulling="2026-03-20 15:32:02.096740464 +0000 UTC m=+2443.712929593" observedRunningTime="2026-03-20 15:32:02.494508908 +0000 UTC m=+2444.110698077" watchObservedRunningTime="2026-03-20 15:32:02.50234097 +0000 UTC m=+2444.118530109"
Mar 20 15:32:03 crc kubenswrapper[4764]: I0320 15:32:03.488645 4764 generic.go:334] "Generic (PLEG): container finished" podID="66303793-29d7-4930-a08d-2d21f4d9c1c2" containerID="21035a430578cd5ed62c2422fdcc5f186cac0ca1bb9f222275cbb681278ac348" exitCode=0
Mar 20 15:32:03 crc kubenswrapper[4764]: I0320 15:32:03.488700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-zn55k" event={"ID":"66303793-29d7-4930-a08d-2d21f4d9c1c2","Type":"ContainerDied","Data":"21035a430578cd5ed62c2422fdcc5f186cac0ca1bb9f222275cbb681278ac348"}
Mar 20 15:32:04 crc kubenswrapper[4764]: I0320 15:32:04.975207 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-zn55k"
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.054958 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49ls\" (UniqueName: \"kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls\") pod \"66303793-29d7-4930-a08d-2d21f4d9c1c2\" (UID: \"66303793-29d7-4930-a08d-2d21f4d9c1c2\") "
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.060849 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls" (OuterVolumeSpecName: "kube-api-access-g49ls") pod "66303793-29d7-4930-a08d-2d21f4d9c1c2" (UID: "66303793-29d7-4930-a08d-2d21f4d9c1c2"). InnerVolumeSpecName "kube-api-access-g49ls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.158590 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49ls\" (UniqueName: \"kubernetes.io/projected/66303793-29d7-4930-a08d-2d21f4d9c1c2-kube-api-access-g49ls\") on node \"crc\" DevicePath \"\""
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.512904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567012-zn55k" event={"ID":"66303793-29d7-4930-a08d-2d21f4d9c1c2","Type":"ContainerDied","Data":"21226d36385cb46747446cbd6d15ec67ae1fa40bce09aa91c67fbf584ffaae71"}
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.513159 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21226d36385cb46747446cbd6d15ec67ae1fa40bce09aa91c67fbf584ffaae71"
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.512971 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567012-zn55k"
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.586855 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-xnnkt"]
Mar 20 15:32:05 crc kubenswrapper[4764]: I0320 15:32:05.593841 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567006-xnnkt"]
Mar 20 15:32:07 crc kubenswrapper[4764]: I0320 15:32:07.140009 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219079ac-b1c7-4b7a-8e7f-4f75e1fc2226" path="/var/lib/kubelet/pods/219079ac-b1c7-4b7a-8e7f-4f75e1fc2226/volumes"
Mar 20 15:32:09 crc kubenswrapper[4764]: I0320 15:32:09.134927 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:32:09 crc kubenswrapper[4764]: E0320 15:32:09.135773 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:32:09 crc kubenswrapper[4764]: I0320 15:32:09.793342 4764 scope.go:117] "RemoveContainer" containerID="a5bf6dd5fc4e14d8d1cd9e5179a0b6c5645ba5b78b626aaf5252395c3023d2ff"
Mar 20 15:32:22 crc kubenswrapper[4764]: I0320 15:32:22.126792 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:32:22 crc kubenswrapper[4764]: E0320 15:32:22.129621 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:32:36 crc kubenswrapper[4764]: I0320 15:32:36.126734 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:32:36 crc kubenswrapper[4764]: E0320 15:32:36.127660 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:32:51 crc kubenswrapper[4764]: I0320 15:32:51.126195 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:32:51 crc kubenswrapper[4764]: E0320 15:32:51.126838 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:33:04 crc kubenswrapper[4764]: I0320 15:33:04.126408 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:33:04 crc kubenswrapper[4764]: E0320 15:33:04.127427 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:33:17 crc kubenswrapper[4764]: I0320 15:33:17.126858 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:33:17 crc kubenswrapper[4764]: E0320 15:33:17.127673 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:33:29 crc kubenswrapper[4764]: I0320 15:33:29.138950 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:33:29 crc kubenswrapper[4764]: E0320 15:33:29.140032 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:33:41 crc kubenswrapper[4764]: I0320 15:33:41.125658 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:33:41 crc kubenswrapper[4764]: E0320 15:33:41.126660 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:33:55 crc kubenswrapper[4764]: I0320 15:33:55.126410 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5"
Mar 20 15:33:55 crc kubenswrapper[4764]: E0320 15:33:55.127233 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.148929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567014-njff4"]
Mar 20 15:34:00 crc kubenswrapper[4764]: E0320 15:34:00.149876 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66303793-29d7-4930-a08d-2d21f4d9c1c2" containerName="oc"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.149891 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66303793-29d7-4930-a08d-2d21f4d9c1c2" containerName="oc"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.150125 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66303793-29d7-4930-a08d-2d21f4d9c1c2" containerName="oc"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.150845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.154810 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.155457 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.155684 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.167885 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-njff4"]
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.332203 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzv7\" (UniqueName: \"kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7\") pod \"auto-csr-approver-29567014-njff4\" (UID: \"55005fc8-2551-443d-92ad-c102d6d1fbcd\") " pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.437349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzv7\" (UniqueName: \"kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7\") pod \"auto-csr-approver-29567014-njff4\" (UID: \"55005fc8-2551-443d-92ad-c102d6d1fbcd\") " pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.457731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzv7\" (UniqueName: \"kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7\") pod \"auto-csr-approver-29567014-njff4\" (UID: \"55005fc8-2551-443d-92ad-c102d6d1fbcd\") " pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.481903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:00 crc kubenswrapper[4764]: I0320 15:34:00.953190 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-njff4"]
Mar 20 15:34:01 crc kubenswrapper[4764]: I0320 15:34:01.767818 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-njff4" event={"ID":"55005fc8-2551-443d-92ad-c102d6d1fbcd","Type":"ContainerStarted","Data":"e7ac31f44d21061e95166ba76d349449f96e08188c6a308eaced3034e9fc856b"}
Mar 20 15:34:02 crc kubenswrapper[4764]: I0320 15:34:02.801849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-njff4" event={"ID":"55005fc8-2551-443d-92ad-c102d6d1fbcd","Type":"ContainerStarted","Data":"0cf8713eaaf5a7ee3aa2317b37e20068524418ead012f1b166debe90aadda9fd"}
Mar 20 15:34:02 crc kubenswrapper[4764]: I0320 15:34:02.823085 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567014-njff4" podStartSLOduration=1.635005426 podStartE2EDuration="2.823065952s" podCreationTimestamp="2026-03-20 15:34:00 +0000 UTC" firstStartedPulling="2026-03-20 15:34:00.957084613 +0000 UTC m=+2562.573273732" lastFinishedPulling="2026-03-20 15:34:02.145145119 +0000 UTC m=+2563.761334258" observedRunningTime="2026-03-20 15:34:02.821034098 +0000 UTC m=+2564.437223237" watchObservedRunningTime="2026-03-20 15:34:02.823065952 +0000 UTC m=+2564.439255091"
Mar 20 15:34:03 crc kubenswrapper[4764]: I0320 15:34:03.817511 4764 generic.go:334] "Generic (PLEG): container finished" podID="55005fc8-2551-443d-92ad-c102d6d1fbcd" containerID="0cf8713eaaf5a7ee3aa2317b37e20068524418ead012f1b166debe90aadda9fd" exitCode=0
Mar 20 15:34:03 crc kubenswrapper[4764]: I0320 15:34:03.817572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-njff4" event={"ID":"55005fc8-2551-443d-92ad-c102d6d1fbcd","Type":"ContainerDied","Data":"0cf8713eaaf5a7ee3aa2317b37e20068524418ead012f1b166debe90aadda9fd"}
Mar 20 15:34:03 crc kubenswrapper[4764]: I0320 15:34:03.819644 4764 generic.go:334] "Generic (PLEG): container finished" podID="4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" containerID="9a98e954acd0cd56bbc43e32d0338cf364c6bfdb51ff639486ba65e902b6cec9" exitCode=0
Mar 20 15:34:03 crc kubenswrapper[4764]: I0320 15:34:03.819669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" event={"ID":"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d","Type":"ContainerDied","Data":"9a98e954acd0cd56bbc43e32d0338cf364c6bfdb51ff639486ba65e902b6cec9"}
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.311807 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.317152 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxtk5\" (UniqueName: \"kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tzv7\" (UniqueName: \"kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7\") pod \"55005fc8-2551-443d-92ad-c102d6d1fbcd\" (UID: \"55005fc8-2551-443d-92ad-c102d6d1fbcd\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355483 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.355538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.356227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.356257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.356317 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.356373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0\") pod \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\" (UID: \"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d\") "
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.377965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.378234 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7" (OuterVolumeSpecName: "kube-api-access-9tzv7") pod "55005fc8-2551-443d-92ad-c102d6d1fbcd" (UID: "55005fc8-2551-443d-92ad-c102d6d1fbcd"). InnerVolumeSpecName "kube-api-access-9tzv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.378310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5" (OuterVolumeSpecName: "kube-api-access-rxtk5") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "kube-api-access-rxtk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.403963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.404939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.409934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.412122 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.425583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.428670 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.428680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.441242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.446842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory" (OuterVolumeSpecName: "inventory") pod "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" (UID: "4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462294 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462349 4764 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462371 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxtk5\" (UniqueName: \"kubernetes.io/projected/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-kube-api-access-rxtk5\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462436 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462456 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462477 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462495 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tzv7\" (UniqueName: \"kubernetes.io/projected/55005fc8-2551-443d-92ad-c102d6d1fbcd-kube-api-access-9tzv7\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462533 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462552 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462573 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.462593 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.848767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2" event={"ID":"4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d","Type":"ContainerDied","Data":"0be69503e7fb981eb90a5b86973eb5290baaa9057600ac5585aa6b2ce30746d9"}
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.848825 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0be69503e7fb981eb90a5b86973eb5290baaa9057600ac5585aa6b2ce30746d9"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.848779 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-pxgf2"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.854055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567014-njff4" event={"ID":"55005fc8-2551-443d-92ad-c102d6d1fbcd","Type":"ContainerDied","Data":"e7ac31f44d21061e95166ba76d349449f96e08188c6a308eaced3034e9fc856b"}
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.854143 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7ac31f44d21061e95166ba76d349449f96e08188c6a308eaced3034e9fc856b"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.854214 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567014-njff4"
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.947427 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-7htpr"]
Mar 20 15:34:05 crc kubenswrapper[4764]: I0320 15:34:05.966505 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567008-7htpr"]
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.036094 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"]
Mar 20 15:34:06 crc kubenswrapper[4764]: E0320 15:34:06.036628 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.036664 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:34:06 crc kubenswrapper[4764]: E0320 15:34:06.036710 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55005fc8-2551-443d-92ad-c102d6d1fbcd" containerName="oc"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.036718 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="55005fc8-2551-443d-92ad-c102d6d1fbcd" containerName="oc"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.036949 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.036994 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="55005fc8-2551-443d-92ad-c102d6d1fbcd" containerName="oc"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.037795 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.043881 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.044100 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.044104 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.044419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8xz9"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.047148 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"]
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.047336 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072400 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xks\" (UniqueName: \"kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072494 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.072963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.174948 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.175036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.175100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"
Mar 20 15:34:06 crc kubenswrapper[4764]: I0320
15:34:06.175123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.175149 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xks\" (UniqueName: \"kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.175183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.175261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.179450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.179493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.180886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.182843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.188010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.188465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.192493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xks\" (UniqueName: \"kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qftg7\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:06 crc kubenswrapper[4764]: I0320 15:34:06.367014 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:34:07 crc kubenswrapper[4764]: I0320 15:34:07.004093 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7"] Mar 20 15:34:07 crc kubenswrapper[4764]: I0320 15:34:07.144565 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03aa4307-ae38-4cb6-a30e-e42af94e2341" path="/var/lib/kubelet/pods/03aa4307-ae38-4cb6-a30e-e42af94e2341/volumes" Mar 20 15:34:07 crc kubenswrapper[4764]: I0320 15:34:07.881589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" event={"ID":"caadbad0-3673-4b77-9805-5d50cf754588","Type":"ContainerStarted","Data":"861df25aced3e06e1329f76b2454f62af558bdf5a8805e976f6ce8f54b2fb7e9"} Mar 20 15:34:07 crc kubenswrapper[4764]: I0320 15:34:07.882125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" event={"ID":"caadbad0-3673-4b77-9805-5d50cf754588","Type":"ContainerStarted","Data":"f7a8570b181b70cc0c9ddf6618f7d94917ac761015ff81d341c33ac0b32d6157"} Mar 20 15:34:07 crc kubenswrapper[4764]: I0320 15:34:07.907942 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" podStartSLOduration=2.386202331 podStartE2EDuration="2.907924823s" podCreationTimestamp="2026-03-20 15:34:05 +0000 UTC" firstStartedPulling="2026-03-20 15:34:06.995428639 +0000 UTC m=+2568.611617808" lastFinishedPulling="2026-03-20 15:34:07.517151121 +0000 UTC m=+2569.133340300" observedRunningTime="2026-03-20 15:34:07.903827377 +0000 UTC m=+2569.520016606" watchObservedRunningTime="2026-03-20 15:34:07.907924823 +0000 UTC m=+2569.524113952" Mar 20 15:34:09 crc kubenswrapper[4764]: I0320 15:34:09.968841 4764 scope.go:117] "RemoveContainer" 
containerID="2a1293a0326344552b0456126d7c8e239fd2c61cc8c89171680628ab2d365db7" Mar 20 15:34:10 crc kubenswrapper[4764]: I0320 15:34:10.126642 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:34:10 crc kubenswrapper[4764]: I0320 15:34:10.919671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c"} Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.151732 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567016-6qnwk"] Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.153988 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.156143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.156781 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.157015 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.164927 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-6qnwk"] Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.322710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt475\" (UniqueName: \"kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475\") pod \"auto-csr-approver-29567016-6qnwk\" 
(UID: \"e238e39c-e9e1-4d0e-b480-ce5c0fc60267\") " pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.424021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt475\" (UniqueName: \"kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475\") pod \"auto-csr-approver-29567016-6qnwk\" (UID: \"e238e39c-e9e1-4d0e-b480-ce5c0fc60267\") " pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.442320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt475\" (UniqueName: \"kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475\") pod \"auto-csr-approver-29567016-6qnwk\" (UID: \"e238e39c-e9e1-4d0e-b480-ce5c0fc60267\") " pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:00 crc kubenswrapper[4764]: I0320 15:36:00.539703 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:01 crc kubenswrapper[4764]: I0320 15:36:01.001608 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-6qnwk"] Mar 20 15:36:01 crc kubenswrapper[4764]: I0320 15:36:01.048838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" event={"ID":"e238e39c-e9e1-4d0e-b480-ce5c0fc60267","Type":"ContainerStarted","Data":"31ddcf2ebcce480ea620f16495645891ef2c6cb287193fb43684944be919db59"} Mar 20 15:36:03 crc kubenswrapper[4764]: I0320 15:36:03.068588 4764 generic.go:334] "Generic (PLEG): container finished" podID="e238e39c-e9e1-4d0e-b480-ce5c0fc60267" containerID="e56e079d1542cd3ba20b76720db7585aa8701fe593a6948039491a942ad82218" exitCode=0 Mar 20 15:36:03 crc kubenswrapper[4764]: I0320 15:36:03.068681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" event={"ID":"e238e39c-e9e1-4d0e-b480-ce5c0fc60267","Type":"ContainerDied","Data":"e56e079d1542cd3ba20b76720db7585aa8701fe593a6948039491a942ad82218"} Mar 20 15:36:04 crc kubenswrapper[4764]: I0320 15:36:04.424350 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:04 crc kubenswrapper[4764]: I0320 15:36:04.511194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt475\" (UniqueName: \"kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475\") pod \"e238e39c-e9e1-4d0e-b480-ce5c0fc60267\" (UID: \"e238e39c-e9e1-4d0e-b480-ce5c0fc60267\") " Mar 20 15:36:04 crc kubenswrapper[4764]: I0320 15:36:04.519775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475" (OuterVolumeSpecName: "kube-api-access-lt475") pod "e238e39c-e9e1-4d0e-b480-ce5c0fc60267" (UID: "e238e39c-e9e1-4d0e-b480-ce5c0fc60267"). InnerVolumeSpecName "kube-api-access-lt475". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:36:04 crc kubenswrapper[4764]: I0320 15:36:04.614037 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt475\" (UniqueName: \"kubernetes.io/projected/e238e39c-e9e1-4d0e-b480-ce5c0fc60267-kube-api-access-lt475\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:05 crc kubenswrapper[4764]: I0320 15:36:05.100029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" event={"ID":"e238e39c-e9e1-4d0e-b480-ce5c0fc60267","Type":"ContainerDied","Data":"31ddcf2ebcce480ea620f16495645891ef2c6cb287193fb43684944be919db59"} Mar 20 15:36:05 crc kubenswrapper[4764]: I0320 15:36:05.100547 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ddcf2ebcce480ea620f16495645891ef2c6cb287193fb43684944be919db59" Mar 20 15:36:05 crc kubenswrapper[4764]: I0320 15:36:05.100406 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567016-6qnwk" Mar 20 15:36:05 crc kubenswrapper[4764]: I0320 15:36:05.496859 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-x2ddf"] Mar 20 15:36:05 crc kubenswrapper[4764]: I0320 15:36:05.507814 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567010-x2ddf"] Mar 20 15:36:07 crc kubenswrapper[4764]: I0320 15:36:07.141188 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b192c7-fa74-49a4-903b-ec0fa1a86ddd" path="/var/lib/kubelet/pods/94b192c7-fa74-49a4-903b-ec0fa1a86ddd/volumes" Mar 20 15:36:10 crc kubenswrapper[4764]: I0320 15:36:10.089526 4764 scope.go:117] "RemoveContainer" containerID="86aec7e13e182c340a9d434b3877ae43a85e119549f65b24aae0ebafb5e50d7f" Mar 20 15:36:32 crc kubenswrapper[4764]: I0320 15:36:32.370646 4764 generic.go:334] "Generic (PLEG): container finished" podID="caadbad0-3673-4b77-9805-5d50cf754588" containerID="861df25aced3e06e1329f76b2454f62af558bdf5a8805e976f6ce8f54b2fb7e9" exitCode=0 Mar 20 15:36:32 crc kubenswrapper[4764]: I0320 15:36:32.371334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" event={"ID":"caadbad0-3673-4b77-9805-5d50cf754588","Type":"ContainerDied","Data":"861df25aced3e06e1329f76b2454f62af558bdf5a8805e976f6ce8f54b2fb7e9"} Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.858543 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xks\" (UniqueName: \"kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922460 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922535 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922600 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 
15:36:33.922629 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.922689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle\") pod \"caadbad0-3673-4b77-9805-5d50cf754588\" (UID: \"caadbad0-3673-4b77-9805-5d50cf754588\") " Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.939347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks" (OuterVolumeSpecName: "kube-api-access-k4xks") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "kube-api-access-k4xks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.946080 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.950056 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.953041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory" (OuterVolumeSpecName: "inventory") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.953167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.955359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:33 crc kubenswrapper[4764]: I0320 15:36:33.993048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "caadbad0-3673-4b77-9805-5d50cf754588" (UID: "caadbad0-3673-4b77-9805-5d50cf754588"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025759 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xks\" (UniqueName: \"kubernetes.io/projected/caadbad0-3673-4b77-9805-5d50cf754588-kube-api-access-k4xks\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025809 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025825 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025840 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025854 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath 
\"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025868 4764 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.025881 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/caadbad0-3673-4b77-9805-5d50cf754588-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.390107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" event={"ID":"caadbad0-3673-4b77-9805-5d50cf754588","Type":"ContainerDied","Data":"f7a8570b181b70cc0c9ddf6618f7d94917ac761015ff81d341c33ac0b32d6157"} Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.390155 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7a8570b181b70cc0c9ddf6618f7d94917ac761015ff81d341c33ac0b32d6157" Mar 20 15:36:34 crc kubenswrapper[4764]: I0320 15:36:34.390192 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qftg7" Mar 20 15:36:38 crc kubenswrapper[4764]: I0320 15:36:38.446857 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:36:38 crc kubenswrapper[4764]: I0320 15:36:38.447550 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.429940 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:40 crc kubenswrapper[4764]: E0320 15:36:40.430673 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e238e39c-e9e1-4d0e-b480-ce5c0fc60267" containerName="oc" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.430689 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e238e39c-e9e1-4d0e-b480-ce5c0fc60267" containerName="oc" Mar 20 15:36:40 crc kubenswrapper[4764]: E0320 15:36:40.430726 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caadbad0-3673-4b77-9805-5d50cf754588" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.430736 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="caadbad0-3673-4b77-9805-5d50cf754588" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.431071 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e238e39c-e9e1-4d0e-b480-ce5c0fc60267" containerName="oc" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.431097 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="caadbad0-3673-4b77-9805-5d50cf754588" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.432952 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.456695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.515916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.515992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hd6\" (UniqueName: \"kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.516024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.617505 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.617588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hd6\" (UniqueName: \"kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.617632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.618133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.621210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.650714 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c7hd6\" (UniqueName: \"kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6\") pod \"community-operators-dgmhm\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:40 crc kubenswrapper[4764]: I0320 15:36:40.751956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:41 crc kubenswrapper[4764]: I0320 15:36:41.282949 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:41 crc kubenswrapper[4764]: I0320 15:36:41.494441 4764 generic.go:334] "Generic (PLEG): container finished" podID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerID="1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b" exitCode=0 Mar 20 15:36:41 crc kubenswrapper[4764]: I0320 15:36:41.494481 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerDied","Data":"1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b"} Mar 20 15:36:41 crc kubenswrapper[4764]: I0320 15:36:41.494507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerStarted","Data":"6bfa77a2498115068e25b31cf4bff5ae17156fb2f82418ce06ea930f5f60e22b"} Mar 20 15:36:43 crc kubenswrapper[4764]: I0320 15:36:43.523852 4764 generic.go:334] "Generic (PLEG): container finished" podID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerID="ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320" exitCode=0 Mar 20 15:36:43 crc kubenswrapper[4764]: I0320 15:36:43.523962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" 
event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerDied","Data":"ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320"} Mar 20 15:36:44 crc kubenswrapper[4764]: I0320 15:36:44.536279 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerStarted","Data":"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84"} Mar 20 15:36:44 crc kubenswrapper[4764]: I0320 15:36:44.561470 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dgmhm" podStartSLOduration=2.088728397 podStartE2EDuration="4.561454072s" podCreationTimestamp="2026-03-20 15:36:40 +0000 UTC" firstStartedPulling="2026-03-20 15:36:41.496938134 +0000 UTC m=+2723.113127263" lastFinishedPulling="2026-03-20 15:36:43.969663799 +0000 UTC m=+2725.585852938" observedRunningTime="2026-03-20 15:36:44.561217735 +0000 UTC m=+2726.177406874" watchObservedRunningTime="2026-03-20 15:36:44.561454072 +0000 UTC m=+2726.177643201" Mar 20 15:36:50 crc kubenswrapper[4764]: I0320 15:36:50.752588 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:50 crc kubenswrapper[4764]: I0320 15:36:50.753140 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:50 crc kubenswrapper[4764]: I0320 15:36:50.815969 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:51 crc kubenswrapper[4764]: I0320 15:36:51.681261 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:51 crc kubenswrapper[4764]: I0320 15:36:51.745152 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:53 crc kubenswrapper[4764]: I0320 15:36:53.620583 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dgmhm" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="registry-server" containerID="cri-o://f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84" gracePeriod=2 Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.596895 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.647609 4764 generic.go:334] "Generic (PLEG): container finished" podID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerID="f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84" exitCode=0 Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.647679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerDied","Data":"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84"} Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.648108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dgmhm" event={"ID":"08130516-b0c8-42f4-a874-31f86dc56a0c","Type":"ContainerDied","Data":"6bfa77a2498115068e25b31cf4bff5ae17156fb2f82418ce06ea930f5f60e22b"} Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.648170 4764 scope.go:117] "RemoveContainer" containerID="f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.647731 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dgmhm" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.680250 4764 scope.go:117] "RemoveContainer" containerID="ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.716774 4764 scope.go:117] "RemoveContainer" containerID="1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.750076 4764 scope.go:117] "RemoveContainer" containerID="f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84" Mar 20 15:36:54 crc kubenswrapper[4764]: E0320 15:36:54.750575 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84\": container with ID starting with f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84 not found: ID does not exist" containerID="f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.750738 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84"} err="failed to get container status \"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84\": rpc error: code = NotFound desc = could not find container \"f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84\": container with ID starting with f78efb79b073ea36a9c669d78c8cec33e13f6ec259a29d1f6c5c5d4490315f84 not found: ID does not exist" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.750831 4764 scope.go:117] "RemoveContainer" containerID="ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320" Mar 20 15:36:54 crc kubenswrapper[4764]: E0320 15:36:54.751260 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320\": container with ID starting with ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320 not found: ID does not exist" containerID="ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.751349 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320"} err="failed to get container status \"ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320\": rpc error: code = NotFound desc = could not find container \"ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320\": container with ID starting with ff3dfbcd1e66cbf20e52fb4573ae1d6ca7274016c81510ac6ff176ede6345320 not found: ID does not exist" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.751437 4764 scope.go:117] "RemoveContainer" containerID="1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b" Mar 20 15:36:54 crc kubenswrapper[4764]: E0320 15:36:54.751916 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b\": container with ID starting with 1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b not found: ID does not exist" containerID="1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.752010 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b"} err="failed to get container status \"1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b\": rpc error: code = NotFound desc = could not find container 
\"1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b\": container with ID starting with 1767054a2090011350080d7bb6d66b1a04cfc2016fa87b7a359b18505182a21b not found: ID does not exist" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.781819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7hd6\" (UniqueName: \"kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6\") pod \"08130516-b0c8-42f4-a874-31f86dc56a0c\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.781978 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities\") pod \"08130516-b0c8-42f4-a874-31f86dc56a0c\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.782135 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content\") pod \"08130516-b0c8-42f4-a874-31f86dc56a0c\" (UID: \"08130516-b0c8-42f4-a874-31f86dc56a0c\") " Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.783711 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities" (OuterVolumeSpecName: "utilities") pod "08130516-b0c8-42f4-a874-31f86dc56a0c" (UID: "08130516-b0c8-42f4-a874-31f86dc56a0c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.792895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6" (OuterVolumeSpecName: "kube-api-access-c7hd6") pod "08130516-b0c8-42f4-a874-31f86dc56a0c" (UID: "08130516-b0c8-42f4-a874-31f86dc56a0c"). InnerVolumeSpecName "kube-api-access-c7hd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.837018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08130516-b0c8-42f4-a874-31f86dc56a0c" (UID: "08130516-b0c8-42f4-a874-31f86dc56a0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.884528 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7hd6\" (UniqueName: \"kubernetes.io/projected/08130516-b0c8-42f4-a874-31f86dc56a0c-kube-api-access-c7hd6\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.884558 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.884569 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08130516-b0c8-42f4-a874-31f86dc56a0c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 15:36:54.981903 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:54 crc kubenswrapper[4764]: I0320 
15:36:54.994102 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dgmhm"] Mar 20 15:36:55 crc kubenswrapper[4764]: I0320 15:36:55.139101 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" path="/var/lib/kubelet/pods/08130516-b0c8-42f4-a874-31f86dc56a0c/volumes" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.831160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:02 crc kubenswrapper[4764]: E0320 15:37:02.832103 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="registry-server" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.832142 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="registry-server" Mar 20 15:37:02 crc kubenswrapper[4764]: E0320 15:37:02.832166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="extract-utilities" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.832173 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="extract-utilities" Mar 20 15:37:02 crc kubenswrapper[4764]: E0320 15:37:02.832188 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="extract-content" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.832195 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="extract-content" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.832400 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="08130516-b0c8-42f4-a874-31f86dc56a0c" containerName="registry-server" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 
15:37:02.833896 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.843409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.968369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.968454 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:02 crc kubenswrapper[4764]: I0320 15:37:02.969075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2bmn\" (UniqueName: \"kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.071419 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 
15:37:03.071476 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.071496 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2bmn\" (UniqueName: \"kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.072268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.072322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.093269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2bmn\" (UniqueName: \"kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn\") pod \"redhat-marketplace-n765m\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.153790 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.649184 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:03 crc kubenswrapper[4764]: I0320 15:37:03.731741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerStarted","Data":"b7f8b01089e6488c77dbc072d5cfdde79eb6a9f6f18bbd0d37fdb1ed6b37fc92"} Mar 20 15:37:04 crc kubenswrapper[4764]: I0320 15:37:04.741207 4764 generic.go:334] "Generic (PLEG): container finished" podID="88dace42-c5ee-449c-9b31-095ae122df16" containerID="e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc" exitCode=0 Mar 20 15:37:04 crc kubenswrapper[4764]: I0320 15:37:04.741265 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerDied","Data":"e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc"} Mar 20 15:37:04 crc kubenswrapper[4764]: I0320 15:37:04.743144 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:37:05 crc kubenswrapper[4764]: I0320 15:37:05.753916 4764 generic.go:334] "Generic (PLEG): container finished" podID="88dace42-c5ee-449c-9b31-095ae122df16" containerID="7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a" exitCode=0 Mar 20 15:37:05 crc kubenswrapper[4764]: I0320 15:37:05.754024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerDied","Data":"7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a"} Mar 20 15:37:06 crc kubenswrapper[4764]: I0320 15:37:06.766753 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerStarted","Data":"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002"} Mar 20 15:37:06 crc kubenswrapper[4764]: I0320 15:37:06.792666 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n765m" podStartSLOduration=3.324564492 podStartE2EDuration="4.792637478s" podCreationTimestamp="2026-03-20 15:37:02 +0000 UTC" firstStartedPulling="2026-03-20 15:37:04.742897403 +0000 UTC m=+2746.359086532" lastFinishedPulling="2026-03-20 15:37:06.210970349 +0000 UTC m=+2747.827159518" observedRunningTime="2026-03-20 15:37:06.786844958 +0000 UTC m=+2748.403034097" watchObservedRunningTime="2026-03-20 15:37:06.792637478 +0000 UTC m=+2748.408826637" Mar 20 15:37:08 crc kubenswrapper[4764]: I0320 15:37:08.443621 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:37:08 crc kubenswrapper[4764]: I0320 15:37:08.443670 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:37:13 crc kubenswrapper[4764]: I0320 15:37:13.154488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:13 crc kubenswrapper[4764]: I0320 15:37:13.155084 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:13 crc kubenswrapper[4764]: I0320 15:37:13.231746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:13 crc kubenswrapper[4764]: I0320 15:37:13.930806 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:13 crc kubenswrapper[4764]: I0320 15:37:13.993492 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:15 crc kubenswrapper[4764]: I0320 15:37:15.872011 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n765m" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="registry-server" containerID="cri-o://39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002" gracePeriod=2 Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.403593 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.529485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities\") pod \"88dace42-c5ee-449c-9b31-095ae122df16\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.529613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2bmn\" (UniqueName: \"kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn\") pod \"88dace42-c5ee-449c-9b31-095ae122df16\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.529910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content\") pod \"88dace42-c5ee-449c-9b31-095ae122df16\" (UID: \"88dace42-c5ee-449c-9b31-095ae122df16\") " Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.530312 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities" (OuterVolumeSpecName: "utilities") pod "88dace42-c5ee-449c-9b31-095ae122df16" (UID: "88dace42-c5ee-449c-9b31-095ae122df16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.530444 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.535915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn" (OuterVolumeSpecName: "kube-api-access-w2bmn") pod "88dace42-c5ee-449c-9b31-095ae122df16" (UID: "88dace42-c5ee-449c-9b31-095ae122df16"). InnerVolumeSpecName "kube-api-access-w2bmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.568797 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88dace42-c5ee-449c-9b31-095ae122df16" (UID: "88dace42-c5ee-449c-9b31-095ae122df16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.632154 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88dace42-c5ee-449c-9b31-095ae122df16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.632867 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2bmn\" (UniqueName: \"kubernetes.io/projected/88dace42-c5ee-449c-9b31-095ae122df16-kube-api-access-w2bmn\") on node \"crc\" DevicePath \"\"" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.882426 4764 generic.go:334] "Generic (PLEG): container finished" podID="88dace42-c5ee-449c-9b31-095ae122df16" containerID="39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002" exitCode=0 Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.882476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerDied","Data":"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002"} Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.882510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n765m" event={"ID":"88dace42-c5ee-449c-9b31-095ae122df16","Type":"ContainerDied","Data":"b7f8b01089e6488c77dbc072d5cfdde79eb6a9f6f18bbd0d37fdb1ed6b37fc92"} Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.882532 4764 scope.go:117] "RemoveContainer" containerID="39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.882591 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n765m" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.917018 4764 scope.go:117] "RemoveContainer" containerID="7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.925595 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.941601 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n765m"] Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.945257 4764 scope.go:117] "RemoveContainer" containerID="e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.982845 4764 scope.go:117] "RemoveContainer" containerID="39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002" Mar 20 15:37:16 crc kubenswrapper[4764]: E0320 15:37:16.983487 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002\": container with ID starting with 39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002 not found: ID does not exist" containerID="39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.983535 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002"} err="failed to get container status \"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002\": rpc error: code = NotFound desc = could not find container \"39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002\": container with ID starting with 39483ea2eb43bcf8230c616e69b8e4efa04d8a4fad8931a4edd4f8a3f1878002 not found: 
ID does not exist" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.983565 4764 scope.go:117] "RemoveContainer" containerID="7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a" Mar 20 15:37:16 crc kubenswrapper[4764]: E0320 15:37:16.983956 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a\": container with ID starting with 7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a not found: ID does not exist" containerID="7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.984061 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a"} err="failed to get container status \"7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a\": rpc error: code = NotFound desc = could not find container \"7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a\": container with ID starting with 7772fa0ea7efd7fa0f0e6c84a49ee66facbbea98bc4f0b7f6307a457a97fd22a not found: ID does not exist" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.984154 4764 scope.go:117] "RemoveContainer" containerID="e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc" Mar 20 15:37:16 crc kubenswrapper[4764]: E0320 15:37:16.984502 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc\": container with ID starting with e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc not found: ID does not exist" containerID="e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc" Mar 20 15:37:16 crc kubenswrapper[4764]: I0320 15:37:16.984530 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc"} err="failed to get container status \"e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc\": rpc error: code = NotFound desc = could not find container \"e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc\": container with ID starting with e7df4aad1e9425ee01ea61af59d429b795896f777d98b6dbf74e0971c8354ddc not found: ID does not exist" Mar 20 15:37:17 crc kubenswrapper[4764]: I0320 15:37:17.139933 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dace42-c5ee-449c-9b31-095ae122df16" path="/var/lib/kubelet/pods/88dace42-c5ee-449c-9b31-095ae122df16/volumes" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.583788 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 15:37:26 crc kubenswrapper[4764]: E0320 15:37:26.584887 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="extract-content" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.584907 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="extract-content" Mar 20 15:37:26 crc kubenswrapper[4764]: E0320 15:37:26.584922 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="extract-utilities" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.584930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="extract-utilities" Mar 20 15:37:26 crc kubenswrapper[4764]: E0320 15:37:26.584966 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="registry-server" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.584973 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="registry-server" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.585169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88dace42-c5ee-449c-9b31-095ae122df16" containerName="registry-server" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.585930 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.588745 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.588862 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.588915 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b2dfb" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.591121 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.601358 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.759972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.760142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.760188 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cjp6\" (UniqueName: \"kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.862847 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863410 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.863887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.864003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.864177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.864263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cjp6\" (UniqueName: \"kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.864437 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.866959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.868238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.870562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.870606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.870972 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.872725 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.884270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cjp6\" (UniqueName: \"kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " 
pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.907629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") " pod="openstack/tempest-tests-tempest" Mar 20 15:37:26 crc kubenswrapper[4764]: I0320 15:37:26.929225 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 15:37:27 crc kubenswrapper[4764]: I0320 15:37:27.583190 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 15:37:27 crc kubenswrapper[4764]: I0320 15:37:27.997555 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2f991298-5b9e-4568-b8b0-24d9d1978a6d","Type":"ContainerStarted","Data":"faef668f6730dd27ada870308aa434e91755c8950748d509c212ee2fa844b095"} Mar 20 15:37:38 crc kubenswrapper[4764]: I0320 15:37:38.443483 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:37:38 crc kubenswrapper[4764]: I0320 15:37:38.444320 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:37:38 crc kubenswrapper[4764]: I0320 15:37:38.444397 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:37:38 crc 
kubenswrapper[4764]: I0320 15:37:38.444964 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:37:38 crc kubenswrapper[4764]: I0320 15:37:38.445046 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c" gracePeriod=600 Mar 20 15:37:39 crc kubenswrapper[4764]: I0320 15:37:39.116232 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c" exitCode=0 Mar 20 15:37:39 crc kubenswrapper[4764]: I0320 15:37:39.116312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c"} Mar 20 15:37:39 crc kubenswrapper[4764]: I0320 15:37:39.116616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3"} Mar 20 15:37:39 crc kubenswrapper[4764]: I0320 15:37:39.116637 4764 scope.go:117] "RemoveContainer" containerID="30ae17d4d752f8bf2b5d58b6280019f6fa251e53b22e5c19f483f598fe52e6c5" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.312287 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.314743 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.328838 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.400104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.400453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4sm\" (UniqueName: \"kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.400557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.501748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " 
pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.502192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.502329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.502711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.502882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4sm\" (UniqueName: \"kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.524508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4sm\" (UniqueName: \"kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm\") pod \"redhat-operators-8zf7q\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " pod="openshift-marketplace/redhat-operators-8zf7q" Mar 
20 15:37:40 crc kubenswrapper[4764]: I0320 15:37:40.641839 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.143972 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567018-dzfdx"] Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.146017 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.157627 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-dzfdx"] Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.157961 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.161606 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.161938 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.318508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xv86\" (UniqueName: \"kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86\") pod \"auto-csr-approver-29567018-dzfdx\" (UID: \"c03568f2-d1ff-4c61-914a-65c9db1c5a43\") " pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.420478 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xv86\" (UniqueName: \"kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86\") pod 
\"auto-csr-approver-29567018-dzfdx\" (UID: \"c03568f2-d1ff-4c61-914a-65c9db1c5a43\") " pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.457811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xv86\" (UniqueName: \"kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86\") pod \"auto-csr-approver-29567018-dzfdx\" (UID: \"c03568f2-d1ff-4c61-914a-65c9db1c5a43\") " pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:00 crc kubenswrapper[4764]: I0320 15:38:00.479898 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:02 crc kubenswrapper[4764]: E0320 15:38:02.207737 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 15:38:02 crc kubenswrapper[4764]: E0320 15:38:02.208276 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7cjp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(2f991298-5b9e-4568-b8b0-24d9d1978a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 15:38:02 crc kubenswrapper[4764]: E0320 15:38:02.209542 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" Mar 20 15:38:02 crc kubenswrapper[4764]: E0320 15:38:02.342639 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" Mar 20 15:38:02 crc 
kubenswrapper[4764]: I0320 15:38:02.635779 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-dzfdx"] Mar 20 15:38:02 crc kubenswrapper[4764]: I0320 15:38:02.692347 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:38:03 crc kubenswrapper[4764]: I0320 15:38:03.348831 4764 generic.go:334] "Generic (PLEG): container finished" podID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerID="ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8" exitCode=0 Mar 20 15:38:03 crc kubenswrapper[4764]: I0320 15:38:03.348921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerDied","Data":"ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8"} Mar 20 15:38:03 crc kubenswrapper[4764]: I0320 15:38:03.348963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerStarted","Data":"240be003bf591cbc0192b7646efbbb8f928845a5a0d8b256ab686c150748dd56"} Mar 20 15:38:03 crc kubenswrapper[4764]: I0320 15:38:03.350438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" event={"ID":"c03568f2-d1ff-4c61-914a-65c9db1c5a43","Type":"ContainerStarted","Data":"2705cf62c69de98ee5250aaffa6c3c3dd2b138f873f54ca7a826130de648c182"} Mar 20 15:38:04 crc kubenswrapper[4764]: I0320 15:38:04.367312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerStarted","Data":"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60"} Mar 20 15:38:04 crc kubenswrapper[4764]: I0320 15:38:04.371151 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="c03568f2-d1ff-4c61-914a-65c9db1c5a43" containerID="365c7468fd6f1b3d1b051ca9dfbe0c3e7b2404ed6a5e0154d3c20f9fb4055643" exitCode=0 Mar 20 15:38:04 crc kubenswrapper[4764]: I0320 15:38:04.371228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" event={"ID":"c03568f2-d1ff-4c61-914a-65c9db1c5a43","Type":"ContainerDied","Data":"365c7468fd6f1b3d1b051ca9dfbe0c3e7b2404ed6a5e0154d3c20f9fb4055643"} Mar 20 15:38:05 crc kubenswrapper[4764]: I0320 15:38:05.381960 4764 generic.go:334] "Generic (PLEG): container finished" podID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerID="4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60" exitCode=0 Mar 20 15:38:05 crc kubenswrapper[4764]: I0320 15:38:05.382178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerDied","Data":"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60"} Mar 20 15:38:05 crc kubenswrapper[4764]: I0320 15:38:05.788445 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:05 crc kubenswrapper[4764]: I0320 15:38:05.933822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xv86\" (UniqueName: \"kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86\") pod \"c03568f2-d1ff-4c61-914a-65c9db1c5a43\" (UID: \"c03568f2-d1ff-4c61-914a-65c9db1c5a43\") " Mar 20 15:38:05 crc kubenswrapper[4764]: I0320 15:38:05.942708 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86" (OuterVolumeSpecName: "kube-api-access-6xv86") pod "c03568f2-d1ff-4c61-914a-65c9db1c5a43" (UID: "c03568f2-d1ff-4c61-914a-65c9db1c5a43"). 
InnerVolumeSpecName "kube-api-access-6xv86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.036345 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xv86\" (UniqueName: \"kubernetes.io/projected/c03568f2-d1ff-4c61-914a-65c9db1c5a43-kube-api-access-6xv86\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.396714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerStarted","Data":"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935"} Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.399043 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" event={"ID":"c03568f2-d1ff-4c61-914a-65c9db1c5a43","Type":"ContainerDied","Data":"2705cf62c69de98ee5250aaffa6c3c3dd2b138f873f54ca7a826130de648c182"} Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.399080 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2705cf62c69de98ee5250aaffa6c3c3dd2b138f873f54ca7a826130de648c182" Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.399079 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567018-dzfdx" Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.434492 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8zf7q" podStartSLOduration=24.028430235 podStartE2EDuration="26.434475561s" podCreationTimestamp="2026-03-20 15:37:40 +0000 UTC" firstStartedPulling="2026-03-20 15:38:03.351313919 +0000 UTC m=+2804.967503038" lastFinishedPulling="2026-03-20 15:38:05.757359235 +0000 UTC m=+2807.373548364" observedRunningTime="2026-03-20 15:38:06.432013325 +0000 UTC m=+2808.048202484" watchObservedRunningTime="2026-03-20 15:38:06.434475561 +0000 UTC m=+2808.050664690" Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.879768 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-zn55k"] Mar 20 15:38:06 crc kubenswrapper[4764]: I0320 15:38:06.890837 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567012-zn55k"] Mar 20 15:38:07 crc kubenswrapper[4764]: I0320 15:38:07.138483 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66303793-29d7-4930-a08d-2d21f4d9c1c2" path="/var/lib/kubelet/pods/66303793-29d7-4930-a08d-2d21f4d9c1c2/volumes" Mar 20 15:38:10 crc kubenswrapper[4764]: I0320 15:38:10.221960 4764 scope.go:117] "RemoveContainer" containerID="21035a430578cd5ed62c2422fdcc5f186cac0ca1bb9f222275cbb681278ac348" Mar 20 15:38:10 crc kubenswrapper[4764]: I0320 15:38:10.642109 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:10 crc kubenswrapper[4764]: I0320 15:38:10.642157 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:11 crc kubenswrapper[4764]: I0320 15:38:11.720710 4764 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-8zf7q" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="registry-server" probeResult="failure" output=< Mar 20 15:38:11 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:38:11 crc kubenswrapper[4764]: > Mar 20 15:38:15 crc kubenswrapper[4764]: I0320 15:38:15.490586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2f991298-5b9e-4568-b8b0-24d9d1978a6d","Type":"ContainerStarted","Data":"058c9a488281413fee8bc893a180ce8831d7178f57e7eb01f953e98e602d3b66"} Mar 20 15:38:15 crc kubenswrapper[4764]: I0320 15:38:15.531247 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.330196488 podStartE2EDuration="50.531217852s" podCreationTimestamp="2026-03-20 15:37:25 +0000 UTC" firstStartedPulling="2026-03-20 15:37:27.579311669 +0000 UTC m=+2769.195500798" lastFinishedPulling="2026-03-20 15:38:13.780333033 +0000 UTC m=+2815.396522162" observedRunningTime="2026-03-20 15:38:15.52111062 +0000 UTC m=+2817.137299749" watchObservedRunningTime="2026-03-20 15:38:15.531217852 +0000 UTC m=+2817.147407021" Mar 20 15:38:20 crc kubenswrapper[4764]: I0320 15:38:20.687000 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:20 crc kubenswrapper[4764]: I0320 15:38:20.751297 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:20 crc kubenswrapper[4764]: I0320 15:38:20.935418 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:38:22 crc kubenswrapper[4764]: I0320 15:38:22.571736 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8zf7q" 
podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="registry-server" containerID="cri-o://61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935" gracePeriod=2 Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.079129 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.200273 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content\") pod \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.200691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities\") pod \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.200737 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h4sm\" (UniqueName: \"kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm\") pod \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\" (UID: \"330fc37d-7efc-4dea-a1c1-1a2b63cbc321\") " Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.201845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities" (OuterVolumeSpecName: "utilities") pod "330fc37d-7efc-4dea-a1c1-1a2b63cbc321" (UID: "330fc37d-7efc-4dea-a1c1-1a2b63cbc321"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.203265 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.212914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm" (OuterVolumeSpecName: "kube-api-access-2h4sm") pod "330fc37d-7efc-4dea-a1c1-1a2b63cbc321" (UID: "330fc37d-7efc-4dea-a1c1-1a2b63cbc321"). InnerVolumeSpecName "kube-api-access-2h4sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.305104 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h4sm\" (UniqueName: \"kubernetes.io/projected/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-kube-api-access-2h4sm\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.349073 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "330fc37d-7efc-4dea-a1c1-1a2b63cbc321" (UID: "330fc37d-7efc-4dea-a1c1-1a2b63cbc321"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.406485 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330fc37d-7efc-4dea-a1c1-1a2b63cbc321-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.584545 4764 generic.go:334] "Generic (PLEG): container finished" podID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerID="61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935" exitCode=0 Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.584589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerDied","Data":"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935"} Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.584614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zf7q" event={"ID":"330fc37d-7efc-4dea-a1c1-1a2b63cbc321","Type":"ContainerDied","Data":"240be003bf591cbc0192b7646efbbb8f928845a5a0d8b256ab686c150748dd56"} Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.584630 4764 scope.go:117] "RemoveContainer" containerID="61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.584753 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8zf7q" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.631586 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.631701 4764 scope.go:117] "RemoveContainer" containerID="4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.652710 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8zf7q"] Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.667685 4764 scope.go:117] "RemoveContainer" containerID="ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.711611 4764 scope.go:117] "RemoveContainer" containerID="61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935" Mar 20 15:38:23 crc kubenswrapper[4764]: E0320 15:38:23.712224 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935\": container with ID starting with 61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935 not found: ID does not exist" containerID="61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.712265 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935"} err="failed to get container status \"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935\": rpc error: code = NotFound desc = could not find container \"61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935\": container with ID starting with 61ca676cb015b663880aed45f473fd680185b2968ef3fb6c6f7d5b5a5fd84935 not found: ID does 
not exist" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.712286 4764 scope.go:117] "RemoveContainer" containerID="4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60" Mar 20 15:38:23 crc kubenswrapper[4764]: E0320 15:38:23.712741 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60\": container with ID starting with 4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60 not found: ID does not exist" containerID="4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.712762 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60"} err="failed to get container status \"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60\": rpc error: code = NotFound desc = could not find container \"4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60\": container with ID starting with 4cf0103d529bbff0ca3778bff17f756e2fae790051573d8f3959097874515a60 not found: ID does not exist" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.712776 4764 scope.go:117] "RemoveContainer" containerID="ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8" Mar 20 15:38:23 crc kubenswrapper[4764]: E0320 15:38:23.713010 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8\": container with ID starting with ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8 not found: ID does not exist" containerID="ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8" Mar 20 15:38:23 crc kubenswrapper[4764]: I0320 15:38:23.713026 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8"} err="failed to get container status \"ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8\": rpc error: code = NotFound desc = could not find container \"ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8\": container with ID starting with ccacc84506cd372f2a8161474c18e09a8fa9d7c6329546e0abf6de3e2855bdb8 not found: ID does not exist" Mar 20 15:38:25 crc kubenswrapper[4764]: I0320 15:38:25.138878 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" path="/var/lib/kubelet/pods/330fc37d-7efc-4dea-a1c1-1a2b63cbc321/volumes" Mar 20 15:39:38 crc kubenswrapper[4764]: I0320 15:39:38.443657 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:39:38 crc kubenswrapper[4764]: I0320 15:39:38.444526 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.149525 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567020-65xzc"] Mar 20 15:40:00 crc kubenswrapper[4764]: E0320 15:40:00.150650 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150667 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="extract-content" Mar 20 15:40:00 crc kubenswrapper[4764]: E0320 15:40:00.150684 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03568f2-d1ff-4c61-914a-65c9db1c5a43" containerName="oc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150690 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03568f2-d1ff-4c61-914a-65c9db1c5a43" containerName="oc" Mar 20 15:40:00 crc kubenswrapper[4764]: E0320 15:40:00.150715 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150721 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4764]: E0320 15:40:00.150732 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150740 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="extract-utilities" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150928 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="330fc37d-7efc-4dea-a1c1-1a2b63cbc321" containerName="registry-server" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.150948 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03568f2-d1ff-4c61-914a-65c9db1c5a43" containerName="oc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.151665 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.153701 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.153720 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.154137 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.162669 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-65xzc"] Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.328491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvz59\" (UniqueName: \"kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59\") pod \"auto-csr-approver-29567020-65xzc\" (UID: \"2ae95724-7309-46a9-9301-6c2a922fc880\") " pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.431444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvz59\" (UniqueName: \"kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59\") pod \"auto-csr-approver-29567020-65xzc\" (UID: \"2ae95724-7309-46a9-9301-6c2a922fc880\") " pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.456928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvz59\" (UniqueName: \"kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59\") pod \"auto-csr-approver-29567020-65xzc\" (UID: \"2ae95724-7309-46a9-9301-6c2a922fc880\") " 
pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:00 crc kubenswrapper[4764]: I0320 15:40:00.478865 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:01 crc kubenswrapper[4764]: I0320 15:40:01.069508 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-65xzc"] Mar 20 15:40:01 crc kubenswrapper[4764]: I0320 15:40:01.475903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-65xzc" event={"ID":"2ae95724-7309-46a9-9301-6c2a922fc880","Type":"ContainerStarted","Data":"d2fac28dbcdaed913b755428472315e604d301894196fe17fdf4baa901140c9a"} Mar 20 15:40:03 crc kubenswrapper[4764]: I0320 15:40:03.496496 4764 generic.go:334] "Generic (PLEG): container finished" podID="2ae95724-7309-46a9-9301-6c2a922fc880" containerID="18b43d8fc974921fc9801467e86580163a9dfb570ffc264417ce62e331d6cf01" exitCode=0 Mar 20 15:40:03 crc kubenswrapper[4764]: I0320 15:40:03.497050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-65xzc" event={"ID":"2ae95724-7309-46a9-9301-6c2a922fc880","Type":"ContainerDied","Data":"18b43d8fc974921fc9801467e86580163a9dfb570ffc264417ce62e331d6cf01"} Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.053917 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.249936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvz59\" (UniqueName: \"kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59\") pod \"2ae95724-7309-46a9-9301-6c2a922fc880\" (UID: \"2ae95724-7309-46a9-9301-6c2a922fc880\") " Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.269973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59" (OuterVolumeSpecName: "kube-api-access-jvz59") pod "2ae95724-7309-46a9-9301-6c2a922fc880" (UID: "2ae95724-7309-46a9-9301-6c2a922fc880"). InnerVolumeSpecName "kube-api-access-jvz59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.352281 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvz59\" (UniqueName: \"kubernetes.io/projected/2ae95724-7309-46a9-9301-6c2a922fc880-kube-api-access-jvz59\") on node \"crc\" DevicePath \"\"" Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.513516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567020-65xzc" event={"ID":"2ae95724-7309-46a9-9301-6c2a922fc880","Type":"ContainerDied","Data":"d2fac28dbcdaed913b755428472315e604d301894196fe17fdf4baa901140c9a"} Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.513564 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2fac28dbcdaed913b755428472315e604d301894196fe17fdf4baa901140c9a" Mar 20 15:40:05 crc kubenswrapper[4764]: I0320 15:40:05.513857 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567020-65xzc" Mar 20 15:40:06 crc kubenswrapper[4764]: I0320 15:40:06.118907 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-njff4"] Mar 20 15:40:06 crc kubenswrapper[4764]: I0320 15:40:06.130235 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567014-njff4"] Mar 20 15:40:07 crc kubenswrapper[4764]: I0320 15:40:07.137980 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55005fc8-2551-443d-92ad-c102d6d1fbcd" path="/var/lib/kubelet/pods/55005fc8-2551-443d-92ad-c102d6d1fbcd/volumes" Mar 20 15:40:08 crc kubenswrapper[4764]: I0320 15:40:08.443688 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:40:08 crc kubenswrapper[4764]: I0320 15:40:08.444222 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:40:10 crc kubenswrapper[4764]: I0320 15:40:10.447132 4764 scope.go:117] "RemoveContainer" containerID="0cf8713eaaf5a7ee3aa2317b37e20068524418ead012f1b166debe90aadda9fd" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.443763 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:40:38 crc kubenswrapper[4764]: 
I0320 15:40:38.444498 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.444560 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.445416 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.445486 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" gracePeriod=600 Mar 20 15:40:38 crc kubenswrapper[4764]: E0320 15:40:38.573582 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.821002 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" exitCode=0 Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.821114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3"} Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.821442 4764 scope.go:117] "RemoveContainer" containerID="5db7e9b0f89aee2d0634b89b6f675cbb57668928d32680c0b56d0d53e1daeb6c" Mar 20 15:40:38 crc kubenswrapper[4764]: I0320 15:40:38.822195 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:40:38 crc kubenswrapper[4764]: E0320 15:40:38.822503 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:40:51 crc kubenswrapper[4764]: I0320 15:40:51.126090 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:40:51 crc kubenswrapper[4764]: E0320 15:40:51.126776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 
15:41:05 crc kubenswrapper[4764]: I0320 15:41:05.131288 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:41:05 crc kubenswrapper[4764]: E0320 15:41:05.131962 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:41:20 crc kubenswrapper[4764]: I0320 15:41:20.126631 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:41:20 crc kubenswrapper[4764]: E0320 15:41:20.127604 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:41:33 crc kubenswrapper[4764]: I0320 15:41:33.125833 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:41:33 crc kubenswrapper[4764]: E0320 15:41:33.127318 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:41:44 crc kubenswrapper[4764]: I0320 15:41:44.127175 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:41:44 crc kubenswrapper[4764]: E0320 15:41:44.127778 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:41:57 crc kubenswrapper[4764]: I0320 15:41:57.126773 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:41:57 crc kubenswrapper[4764]: E0320 15:41:57.127899 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.151676 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567022-9b9mf"] Mar 20 15:42:00 crc kubenswrapper[4764]: E0320 15:42:00.153099 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae95724-7309-46a9-9301-6c2a922fc880" containerName="oc" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.153131 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae95724-7309-46a9-9301-6c2a922fc880" containerName="oc" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.153574 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae95724-7309-46a9-9301-6c2a922fc880" containerName="oc" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.154595 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.158619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.158924 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.159036 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.161849 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-9b9mf"] Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.358375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v87r\" (UniqueName: \"kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r\") pod \"auto-csr-approver-29567022-9b9mf\" (UID: \"91f6364c-5c64-4314-81c2-3477d10a069a\") " pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.460779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v87r\" (UniqueName: \"kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r\") pod \"auto-csr-approver-29567022-9b9mf\" (UID: \"91f6364c-5c64-4314-81c2-3477d10a069a\") " pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.493052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-4v87r\" (UniqueName: \"kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r\") pod \"auto-csr-approver-29567022-9b9mf\" (UID: \"91f6364c-5c64-4314-81c2-3477d10a069a\") " pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:00 crc kubenswrapper[4764]: I0320 15:42:00.784523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:01 crc kubenswrapper[4764]: I0320 15:42:01.311393 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-9b9mf"] Mar 20 15:42:01 crc kubenswrapper[4764]: I0320 15:42:01.530858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" event={"ID":"91f6364c-5c64-4314-81c2-3477d10a069a","Type":"ContainerStarted","Data":"447f7b59560dfa0f551cdae85c1bae3e9fcb22ca110abcb6321f4898f14a55f9"} Mar 20 15:42:03 crc kubenswrapper[4764]: I0320 15:42:03.556874 4764 generic.go:334] "Generic (PLEG): container finished" podID="91f6364c-5c64-4314-81c2-3477d10a069a" containerID="d971b1c7671a18c61e2a60af840d12c1b5e75d283f3bf012087d89e76e319f85" exitCode=0 Mar 20 15:42:03 crc kubenswrapper[4764]: I0320 15:42:03.556916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" event={"ID":"91f6364c-5c64-4314-81c2-3477d10a069a","Type":"ContainerDied","Data":"d971b1c7671a18c61e2a60af840d12c1b5e75d283f3bf012087d89e76e319f85"} Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.122532 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.290405 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v87r\" (UniqueName: \"kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r\") pod \"91f6364c-5c64-4314-81c2-3477d10a069a\" (UID: \"91f6364c-5c64-4314-81c2-3477d10a069a\") " Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.299568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r" (OuterVolumeSpecName: "kube-api-access-4v87r") pod "91f6364c-5c64-4314-81c2-3477d10a069a" (UID: "91f6364c-5c64-4314-81c2-3477d10a069a"). InnerVolumeSpecName "kube-api-access-4v87r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.392801 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v87r\" (UniqueName: \"kubernetes.io/projected/91f6364c-5c64-4314-81c2-3477d10a069a-kube-api-access-4v87r\") on node \"crc\" DevicePath \"\"" Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.576131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" event={"ID":"91f6364c-5c64-4314-81c2-3477d10a069a","Type":"ContainerDied","Data":"447f7b59560dfa0f551cdae85c1bae3e9fcb22ca110abcb6321f4898f14a55f9"} Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.576174 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447f7b59560dfa0f551cdae85c1bae3e9fcb22ca110abcb6321f4898f14a55f9" Mar 20 15:42:05 crc kubenswrapper[4764]: I0320 15:42:05.576238 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567022-9b9mf" Mar 20 15:42:06 crc kubenswrapper[4764]: I0320 15:42:06.220045 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-6qnwk"] Mar 20 15:42:06 crc kubenswrapper[4764]: I0320 15:42:06.231752 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567016-6qnwk"] Mar 20 15:42:07 crc kubenswrapper[4764]: I0320 15:42:07.137115 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e238e39c-e9e1-4d0e-b480-ce5c0fc60267" path="/var/lib/kubelet/pods/e238e39c-e9e1-4d0e-b480-ce5c0fc60267/volumes" Mar 20 15:42:10 crc kubenswrapper[4764]: I0320 15:42:10.126310 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:42:10 crc kubenswrapper[4764]: E0320 15:42:10.126644 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:42:10 crc kubenswrapper[4764]: I0320 15:42:10.536679 4764 scope.go:117] "RemoveContainer" containerID="e56e079d1542cd3ba20b76720db7585aa8701fe593a6948039491a942ad82218" Mar 20 15:42:21 crc kubenswrapper[4764]: I0320 15:42:21.127357 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:42:21 crc kubenswrapper[4764]: E0320 15:42:21.128095 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:42:34 crc kubenswrapper[4764]: I0320 15:42:34.126486 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:42:34 crc kubenswrapper[4764]: E0320 15:42:34.128167 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:42:39 crc kubenswrapper[4764]: I0320 15:42:39.957606 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:42:39 crc kubenswrapper[4764]: E0320 15:42:39.958376 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f6364c-5c64-4314-81c2-3477d10a069a" containerName="oc" Mar 20 15:42:39 crc kubenswrapper[4764]: I0320 15:42:39.958401 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f6364c-5c64-4314-81c2-3477d10a069a" containerName="oc" Mar 20 15:42:39 crc kubenswrapper[4764]: I0320 15:42:39.958589 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f6364c-5c64-4314-81c2-3477d10a069a" containerName="oc" Mar 20 15:42:39 crc kubenswrapper[4764]: I0320 15:42:39.959788 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:39 crc kubenswrapper[4764]: I0320 15:42:39.979051 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.078031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.078348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7zd\" (UniqueName: \"kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.078561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.180357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.180497 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pm7zd\" (UniqueName: \"kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.180575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.180886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.181019 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.221374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm7zd\" (UniqueName: \"kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd\") pod \"certified-operators-vsdjr\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.324591 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.823558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:42:40 crc kubenswrapper[4764]: I0320 15:42:40.933553 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerStarted","Data":"78ad24164689cfbd55d4f635e07aca6e58b48a5f5d94dfe63b1db2b3e5a4aaf8"} Mar 20 15:42:41 crc kubenswrapper[4764]: I0320 15:42:41.944083 4764 generic.go:334] "Generic (PLEG): container finished" podID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerID="06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677" exitCode=0 Mar 20 15:42:41 crc kubenswrapper[4764]: I0320 15:42:41.944141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerDied","Data":"06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677"} Mar 20 15:42:41 crc kubenswrapper[4764]: I0320 15:42:41.945988 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:42:43 crc kubenswrapper[4764]: I0320 15:42:43.963272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerStarted","Data":"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03"} Mar 20 15:42:45 crc kubenswrapper[4764]: I0320 15:42:45.984330 4764 generic.go:334] "Generic (PLEG): container finished" podID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerID="88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03" exitCode=0 Mar 20 15:42:45 crc kubenswrapper[4764]: I0320 15:42:45.984816 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerDied","Data":"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03"} Mar 20 15:42:47 crc kubenswrapper[4764]: I0320 15:42:47.003916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerStarted","Data":"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7"} Mar 20 15:42:47 crc kubenswrapper[4764]: I0320 15:42:47.041232 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsdjr" podStartSLOduration=3.311766041 podStartE2EDuration="8.041212196s" podCreationTimestamp="2026-03-20 15:42:39 +0000 UTC" firstStartedPulling="2026-03-20 15:42:41.945746065 +0000 UTC m=+3083.561935194" lastFinishedPulling="2026-03-20 15:42:46.6751922 +0000 UTC m=+3088.291381349" observedRunningTime="2026-03-20 15:42:47.023855679 +0000 UTC m=+3088.640044808" watchObservedRunningTime="2026-03-20 15:42:47.041212196 +0000 UTC m=+3088.657401325" Mar 20 15:42:48 crc kubenswrapper[4764]: I0320 15:42:48.127192 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:42:48 crc kubenswrapper[4764]: E0320 15:42:48.127487 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:42:50 crc kubenswrapper[4764]: I0320 15:42:50.324928 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:50 crc kubenswrapper[4764]: I0320 15:42:50.325232 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:42:51 crc kubenswrapper[4764]: I0320 15:42:51.372876 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vsdjr" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="registry-server" probeResult="failure" output=< Mar 20 15:42:51 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Mar 20 15:42:51 crc kubenswrapper[4764]: > Mar 20 15:42:59 crc kubenswrapper[4764]: I0320 15:42:59.133676 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:42:59 crc kubenswrapper[4764]: E0320 15:42:59.134619 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:43:00 crc kubenswrapper[4764]: I0320 15:43:00.372306 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:43:00 crc kubenswrapper[4764]: I0320 15:43:00.417146 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:43:00 crc kubenswrapper[4764]: I0320 15:43:00.607703 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.135976 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/certified-operators-vsdjr" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="registry-server" containerID="cri-o://444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7" gracePeriod=2 Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.812610 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.961582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities\") pod \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.961645 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm7zd\" (UniqueName: \"kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd\") pod \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.961697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content\") pod \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\" (UID: \"2069ec9d-5dc2-4a95-8254-16d6214a7e5b\") " Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.963168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities" (OuterVolumeSpecName: "utilities") pod "2069ec9d-5dc2-4a95-8254-16d6214a7e5b" (UID: "2069ec9d-5dc2-4a95-8254-16d6214a7e5b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:43:02 crc kubenswrapper[4764]: I0320 15:43:02.980582 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd" (OuterVolumeSpecName: "kube-api-access-pm7zd") pod "2069ec9d-5dc2-4a95-8254-16d6214a7e5b" (UID: "2069ec9d-5dc2-4a95-8254-16d6214a7e5b"). InnerVolumeSpecName "kube-api-access-pm7zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.019424 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2069ec9d-5dc2-4a95-8254-16d6214a7e5b" (UID: "2069ec9d-5dc2-4a95-8254-16d6214a7e5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.063156 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.063183 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm7zd\" (UniqueName: \"kubernetes.io/projected/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-kube-api-access-pm7zd\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.063194 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2069ec9d-5dc2-4a95-8254-16d6214a7e5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.143892 4764 generic.go:334] "Generic (PLEG): container finished" podID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" 
containerID="444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7" exitCode=0 Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.143946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerDied","Data":"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7"} Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.143954 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsdjr" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.143978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsdjr" event={"ID":"2069ec9d-5dc2-4a95-8254-16d6214a7e5b","Type":"ContainerDied","Data":"78ad24164689cfbd55d4f635e07aca6e58b48a5f5d94dfe63b1db2b3e5a4aaf8"} Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.144000 4764 scope.go:117] "RemoveContainer" containerID="444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.184981 4764 scope.go:117] "RemoveContainer" containerID="88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.185321 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.194034 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsdjr"] Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.208276 4764 scope.go:117] "RemoveContainer" containerID="06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.244786 4764 scope.go:117] "RemoveContainer" containerID="444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7" Mar 20 
15:43:03 crc kubenswrapper[4764]: E0320 15:43:03.246772 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7\": container with ID starting with 444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7 not found: ID does not exist" containerID="444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.247173 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7"} err="failed to get container status \"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7\": rpc error: code = NotFound desc = could not find container \"444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7\": container with ID starting with 444e2644c8e4760f88f5cbd8575802a5c8e0797136e21abdfed6a835ce039df7 not found: ID does not exist" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.247202 4764 scope.go:117] "RemoveContainer" containerID="88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03" Mar 20 15:43:03 crc kubenswrapper[4764]: E0320 15:43:03.247751 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03\": container with ID starting with 88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03 not found: ID does not exist" containerID="88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.247793 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03"} err="failed to get container status 
\"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03\": rpc error: code = NotFound desc = could not find container \"88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03\": container with ID starting with 88be549125ea1dab6c6cb25e2b2c7ae01f4ca7910f64d83e87e8df8a962e2c03 not found: ID does not exist" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.247821 4764 scope.go:117] "RemoveContainer" containerID="06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677" Mar 20 15:43:03 crc kubenswrapper[4764]: E0320 15:43:03.250341 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677\": container with ID starting with 06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677 not found: ID does not exist" containerID="06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677" Mar 20 15:43:03 crc kubenswrapper[4764]: I0320 15:43:03.250406 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677"} err="failed to get container status \"06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677\": rpc error: code = NotFound desc = could not find container \"06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677\": container with ID starting with 06aa94f76635b533809c120842042aa574de528dca9c250e407b19ec9fc40677 not found: ID does not exist" Mar 20 15:43:05 crc kubenswrapper[4764]: I0320 15:43:05.145316 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" path="/var/lib/kubelet/pods/2069ec9d-5dc2-4a95-8254-16d6214a7e5b/volumes" Mar 20 15:43:10 crc kubenswrapper[4764]: I0320 15:43:10.126745 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 
15:43:10 crc kubenswrapper[4764]: E0320 15:43:10.127903 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:43:23 crc kubenswrapper[4764]: I0320 15:43:23.126711 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:43:23 crc kubenswrapper[4764]: E0320 15:43:23.127467 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:43:37 crc kubenswrapper[4764]: I0320 15:43:37.127697 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:43:37 crc kubenswrapper[4764]: E0320 15:43:37.128542 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:43:49 crc kubenswrapper[4764]: I0320 15:43:49.132001 4764 scope.go:117] "RemoveContainer" 
containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:43:49 crc kubenswrapper[4764]: E0320 15:43:49.132715 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.149348 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567024-pdf5w"] Mar 20 15:44:00 crc kubenswrapper[4764]: E0320 15:44:00.150360 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="extract-content" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.150436 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="extract-content" Mar 20 15:44:00 crc kubenswrapper[4764]: E0320 15:44:00.150470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="registry-server" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.150479 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="registry-server" Mar 20 15:44:00 crc kubenswrapper[4764]: E0320 15:44:00.150510 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="extract-utilities" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.150519 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="extract-utilities" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 
15:44:00.150773 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2069ec9d-5dc2-4a95-8254-16d6214a7e5b" containerName="registry-server" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.151465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.152947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.153848 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.154086 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.172733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-pdf5w"] Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.321819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24d2t\" (UniqueName: \"kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t\") pod \"auto-csr-approver-29567024-pdf5w\" (UID: \"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a\") " pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.423680 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24d2t\" (UniqueName: \"kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t\") pod \"auto-csr-approver-29567024-pdf5w\" (UID: \"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a\") " pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.440853 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24d2t\" (UniqueName: \"kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t\") pod \"auto-csr-approver-29567024-pdf5w\" (UID: \"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a\") " pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:00 crc kubenswrapper[4764]: I0320 15:44:00.488446 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:01 crc kubenswrapper[4764]: I0320 15:44:01.127443 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:44:01 crc kubenswrapper[4764]: E0320 15:44:01.128132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:44:01 crc kubenswrapper[4764]: I0320 15:44:01.141929 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-pdf5w"] Mar 20 15:44:01 crc kubenswrapper[4764]: I0320 15:44:01.776344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" event={"ID":"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a","Type":"ContainerStarted","Data":"c10f90b80bab7cf993c5e9c91d36f24cf4ed0142e94787355861bcff044715ab"} Mar 20 15:44:03 crc kubenswrapper[4764]: I0320 15:44:03.794291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" 
event={"ID":"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a","Type":"ContainerStarted","Data":"d3fde9228a7d2b3589b8c5ec24a4cb33697f8eda68053f8a2c8f2993ca918522"} Mar 20 15:44:03 crc kubenswrapper[4764]: I0320 15:44:03.809046 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" podStartSLOduration=1.8001919750000002 podStartE2EDuration="3.809016504s" podCreationTimestamp="2026-03-20 15:44:00 +0000 UTC" firstStartedPulling="2026-03-20 15:44:01.141973998 +0000 UTC m=+3162.758163127" lastFinishedPulling="2026-03-20 15:44:03.150798527 +0000 UTC m=+3164.766987656" observedRunningTime="2026-03-20 15:44:03.807505627 +0000 UTC m=+3165.423694756" watchObservedRunningTime="2026-03-20 15:44:03.809016504 +0000 UTC m=+3165.425205633" Mar 20 15:44:05 crc kubenswrapper[4764]: I0320 15:44:05.814696 4764 generic.go:334] "Generic (PLEG): container finished" podID="0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" containerID="d3fde9228a7d2b3589b8c5ec24a4cb33697f8eda68053f8a2c8f2993ca918522" exitCode=0 Mar 20 15:44:05 crc kubenswrapper[4764]: I0320 15:44:05.814798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" event={"ID":"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a","Type":"ContainerDied","Data":"d3fde9228a7d2b3589b8c5ec24a4cb33697f8eda68053f8a2c8f2993ca918522"} Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.315104 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.459987 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24d2t\" (UniqueName: \"kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t\") pod \"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a\" (UID: \"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a\") " Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.474237 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t" (OuterVolumeSpecName: "kube-api-access-24d2t") pod "0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" (UID: "0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a"). InnerVolumeSpecName "kube-api-access-24d2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.562250 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24d2t\" (UniqueName: \"kubernetes.io/projected/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a-kube-api-access-24d2t\") on node \"crc\" DevicePath \"\"" Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.830147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" event={"ID":"0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a","Type":"ContainerDied","Data":"c10f90b80bab7cf993c5e9c91d36f24cf4ed0142e94787355861bcff044715ab"} Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.830183 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10f90b80bab7cf993c5e9c91d36f24cf4ed0142e94787355861bcff044715ab" Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.830229 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567024-pdf5w" Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.898833 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-dzfdx"] Mar 20 15:44:07 crc kubenswrapper[4764]: I0320 15:44:07.906335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567018-dzfdx"] Mar 20 15:44:09 crc kubenswrapper[4764]: I0320 15:44:09.143700 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03568f2-d1ff-4c61-914a-65c9db1c5a43" path="/var/lib/kubelet/pods/c03568f2-d1ff-4c61-914a-65c9db1c5a43/volumes" Mar 20 15:44:10 crc kubenswrapper[4764]: I0320 15:44:10.689358 4764 scope.go:117] "RemoveContainer" containerID="365c7468fd6f1b3d1b051ca9dfbe0c3e7b2404ed6a5e0154d3c20f9fb4055643" Mar 20 15:44:16 crc kubenswrapper[4764]: I0320 15:44:16.126281 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:44:16 crc kubenswrapper[4764]: E0320 15:44:16.127466 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:44:30 crc kubenswrapper[4764]: I0320 15:44:30.126097 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:44:30 crc kubenswrapper[4764]: E0320 15:44:30.126984 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:44:41 crc kubenswrapper[4764]: I0320 15:44:41.126179 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:44:41 crc kubenswrapper[4764]: E0320 15:44:41.126966 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:44:52 crc kubenswrapper[4764]: I0320 15:44:52.127172 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:44:52 crc kubenswrapper[4764]: E0320 15:44:52.127965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.151474 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8"] Mar 20 15:45:00 crc kubenswrapper[4764]: E0320 15:45:00.152248 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" containerName="oc" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 
15:45:00.152259 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" containerName="oc" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.152453 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" containerName="oc" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.153023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.155294 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.156319 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.164338 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8"] Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.319714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwrh5\" (UniqueName: \"kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.319760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.320745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.422535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwrh5\" (UniqueName: \"kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.422806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.423165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.424139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.428834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.440213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwrh5\" (UniqueName: \"kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5\") pod \"collect-profiles-29567025-m4bp8\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.484789 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:00 crc kubenswrapper[4764]: I0320 15:45:00.975419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8"] Mar 20 15:45:01 crc kubenswrapper[4764]: I0320 15:45:01.303218 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" event={"ID":"70a2dc7f-51c9-4afb-903d-d538d6330fa6","Type":"ContainerStarted","Data":"1f52ff4b334bbeeb7e29491351ff8b3a8e7adb962eadc540293a2ca957750626"} Mar 20 15:45:01 crc kubenswrapper[4764]: I0320 15:45:01.303551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" event={"ID":"70a2dc7f-51c9-4afb-903d-d538d6330fa6","Type":"ContainerStarted","Data":"70e376be87e444b05e2914d3d84f309d501a23ac4485aafcc6cb0a299b150977"} Mar 20 15:45:01 crc kubenswrapper[4764]: I0320 15:45:01.324954 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" podStartSLOduration=1.324929667 podStartE2EDuration="1.324929667s" podCreationTimestamp="2026-03-20 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:45:01.317931021 +0000 UTC m=+3222.934120170" watchObservedRunningTime="2026-03-20 15:45:01.324929667 +0000 UTC m=+3222.941118796" Mar 20 15:45:02 crc kubenswrapper[4764]: I0320 15:45:02.316376 4764 generic.go:334] "Generic (PLEG): container finished" podID="70a2dc7f-51c9-4afb-903d-d538d6330fa6" containerID="1f52ff4b334bbeeb7e29491351ff8b3a8e7adb962eadc540293a2ca957750626" exitCode=0 Mar 20 15:45:02 crc kubenswrapper[4764]: I0320 15:45:02.316497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" event={"ID":"70a2dc7f-51c9-4afb-903d-d538d6330fa6","Type":"ContainerDied","Data":"1f52ff4b334bbeeb7e29491351ff8b3a8e7adb962eadc540293a2ca957750626"} Mar 20 15:45:03 crc kubenswrapper[4764]: I0320 15:45:03.813238 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.000400 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwrh5\" (UniqueName: \"kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5\") pod \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.000678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume\") pod \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.000789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume\") pod \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\" (UID: \"70a2dc7f-51c9-4afb-903d-d538d6330fa6\") " Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.001675 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume" (OuterVolumeSpecName: "config-volume") pod "70a2dc7f-51c9-4afb-903d-d538d6330fa6" (UID: "70a2dc7f-51c9-4afb-903d-d538d6330fa6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.006612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "70a2dc7f-51c9-4afb-903d-d538d6330fa6" (UID: "70a2dc7f-51c9-4afb-903d-d538d6330fa6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.008130 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5" (OuterVolumeSpecName: "kube-api-access-wwrh5") pod "70a2dc7f-51c9-4afb-903d-d538d6330fa6" (UID: "70a2dc7f-51c9-4afb-903d-d538d6330fa6"). InnerVolumeSpecName "kube-api-access-wwrh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.132880 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/70a2dc7f-51c9-4afb-903d-d538d6330fa6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.133205 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwrh5\" (UniqueName: \"kubernetes.io/projected/70a2dc7f-51c9-4afb-903d-d538d6330fa6-kube-api-access-wwrh5\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.133221 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/70a2dc7f-51c9-4afb-903d-d538d6330fa6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.337607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" 
event={"ID":"70a2dc7f-51c9-4afb-903d-d538d6330fa6","Type":"ContainerDied","Data":"70e376be87e444b05e2914d3d84f309d501a23ac4485aafcc6cb0a299b150977"} Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.337937 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e376be87e444b05e2914d3d84f309d501a23ac4485aafcc6cb0a299b150977" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.337666 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567025-m4bp8" Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.414984 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr"] Mar 20 15:45:04 crc kubenswrapper[4764]: I0320 15:45:04.424157 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-qm9kr"] Mar 20 15:45:05 crc kubenswrapper[4764]: I0320 15:45:05.126499 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:45:05 crc kubenswrapper[4764]: E0320 15:45:05.126846 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:45:05 crc kubenswrapper[4764]: I0320 15:45:05.144944 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1638bff-8fe7-45a5-a794-c40aa474724f" path="/var/lib/kubelet/pods/a1638bff-8fe7-45a5-a794-c40aa474724f/volumes" Mar 20 15:45:10 crc kubenswrapper[4764]: I0320 15:45:10.802454 4764 scope.go:117] "RemoveContainer" 
containerID="7046506c8f1bb640699d4c3a42e4cc8d33e686c3dcda026b300ab6c66fea7262" Mar 20 15:45:20 crc kubenswrapper[4764]: I0320 15:45:20.126789 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:45:20 crc kubenswrapper[4764]: E0320 15:45:20.127907 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:45:35 crc kubenswrapper[4764]: I0320 15:45:35.127038 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:45:35 crc kubenswrapper[4764]: E0320 15:45:35.127946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:45:50 crc kubenswrapper[4764]: I0320 15:45:50.127031 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3" Mar 20 15:45:50 crc kubenswrapper[4764]: I0320 15:45:50.758819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63"} Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 
15:46:00.177542 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567026-qvcm6"] Mar 20 15:46:00 crc kubenswrapper[4764]: E0320 15:46:00.178521 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70a2dc7f-51c9-4afb-903d-d538d6330fa6" containerName="collect-profiles" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.178537 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="70a2dc7f-51c9-4afb-903d-d538d6330fa6" containerName="collect-profiles" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.178725 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="70a2dc7f-51c9-4afb-903d-d538d6330fa6" containerName="collect-profiles" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.179303 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.181706 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.181886 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.183079 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.195023 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-qvcm6"] Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.259092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4vz\" (UniqueName: \"kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz\") pod \"auto-csr-approver-29567026-qvcm6\" (UID: 
\"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e\") " pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.360699 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4vz\" (UniqueName: \"kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz\") pod \"auto-csr-approver-29567026-qvcm6\" (UID: \"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e\") " pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.381814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv4vz\" (UniqueName: \"kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz\") pod \"auto-csr-approver-29567026-qvcm6\" (UID: \"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e\") " pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.505814 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:00 crc kubenswrapper[4764]: I0320 15:46:00.953634 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-qvcm6"] Mar 20 15:46:01 crc kubenswrapper[4764]: I0320 15:46:01.861877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" event={"ID":"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e","Type":"ContainerStarted","Data":"da1a398bfb1749bb8fa7b0de64568b6b8b1ff046b3f22e8ba87e6c9e3603cd14"} Mar 20 15:46:02 crc kubenswrapper[4764]: I0320 15:46:02.873034 4764 generic.go:334] "Generic (PLEG): container finished" podID="0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" containerID="e05561671954527ad8408ec61f1ddd51537d0325b5c28512d8a11073a03c118d" exitCode=0 Mar 20 15:46:02 crc kubenswrapper[4764]: I0320 15:46:02.873147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" event={"ID":"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e","Type":"ContainerDied","Data":"e05561671954527ad8408ec61f1ddd51537d0325b5c28512d8a11073a03c118d"} Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.364541 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.538910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv4vz\" (UniqueName: \"kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz\") pod \"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e\" (UID: \"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e\") " Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.544118 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz" (OuterVolumeSpecName: "kube-api-access-sv4vz") pod "0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" (UID: "0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e"). InnerVolumeSpecName "kube-api-access-sv4vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.641475 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv4vz\" (UniqueName: \"kubernetes.io/projected/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e-kube-api-access-sv4vz\") on node \"crc\" DevicePath \"\"" Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.892794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" event={"ID":"0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e","Type":"ContainerDied","Data":"da1a398bfb1749bb8fa7b0de64568b6b8b1ff046b3f22e8ba87e6c9e3603cd14"} Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.892828 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1a398bfb1749bb8fa7b0de64568b6b8b1ff046b3f22e8ba87e6c9e3603cd14" Mar 20 15:46:04 crc kubenswrapper[4764]: I0320 15:46:04.892881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567026-qvcm6" Mar 20 15:46:05 crc kubenswrapper[4764]: I0320 15:46:05.450052 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-65xzc"] Mar 20 15:46:05 crc kubenswrapper[4764]: I0320 15:46:05.461600 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567020-65xzc"] Mar 20 15:46:07 crc kubenswrapper[4764]: I0320 15:46:07.141877 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae95724-7309-46a9-9301-6c2a922fc880" path="/var/lib/kubelet/pods/2ae95724-7309-46a9-9301-6c2a922fc880/volumes" Mar 20 15:46:10 crc kubenswrapper[4764]: I0320 15:46:10.890482 4764 scope.go:117] "RemoveContainer" containerID="18b43d8fc974921fc9801467e86580163a9dfb570ffc264417ce62e331d6cf01" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.764615 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:46:57 crc kubenswrapper[4764]: E0320 15:46:57.765566 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" containerName="oc" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.765581 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" containerName="oc" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.765812 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" containerName="oc" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.769549 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.811350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.830886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.830934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.831001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8sxf\" (UniqueName: \"kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.932552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.932852 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c8sxf\" (UniqueName: \"kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.933069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.933689 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.933923 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:57 crc kubenswrapper[4764]: I0320 15:46:57.951549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8sxf\" (UniqueName: \"kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf\") pod \"community-operators-wbkvz\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:58 crc kubenswrapper[4764]: I0320 15:46:58.090698 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:46:58 crc kubenswrapper[4764]: I0320 15:46:58.648557 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:46:59 crc kubenswrapper[4764]: I0320 15:46:59.425553 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerID="a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381" exitCode=0 Mar 20 15:46:59 crc kubenswrapper[4764]: I0320 15:46:59.425613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerDied","Data":"a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381"} Mar 20 15:46:59 crc kubenswrapper[4764]: I0320 15:46:59.426077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerStarted","Data":"19272a7dd03318d57b17dc2ca9178edff3e11e4c0dc22358bdcd9d04da53dac9"} Mar 20 15:47:00 crc kubenswrapper[4764]: I0320 15:47:00.438129 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerStarted","Data":"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f"} Mar 20 15:47:02 crc kubenswrapper[4764]: I0320 15:47:02.461732 4764 generic.go:334] "Generic (PLEG): container finished" podID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerID="6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f" exitCode=0 Mar 20 15:47:02 crc kubenswrapper[4764]: I0320 15:47:02.461779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" 
event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerDied","Data":"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f"} Mar 20 15:47:03 crc kubenswrapper[4764]: I0320 15:47:03.473360 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerStarted","Data":"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245"} Mar 20 15:47:03 crc kubenswrapper[4764]: I0320 15:47:03.494938 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wbkvz" podStartSLOduration=3.037124112 podStartE2EDuration="6.494917434s" podCreationTimestamp="2026-03-20 15:46:57 +0000 UTC" firstStartedPulling="2026-03-20 15:46:59.427631595 +0000 UTC m=+3341.043820754" lastFinishedPulling="2026-03-20 15:47:02.885424947 +0000 UTC m=+3344.501614076" observedRunningTime="2026-03-20 15:47:03.492303384 +0000 UTC m=+3345.108492523" watchObservedRunningTime="2026-03-20 15:47:03.494917434 +0000 UTC m=+3345.111106573" Mar 20 15:47:08 crc kubenswrapper[4764]: I0320 15:47:08.091780 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:08 crc kubenswrapper[4764]: I0320 15:47:08.092248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:08 crc kubenswrapper[4764]: I0320 15:47:08.153623 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:08 crc kubenswrapper[4764]: I0320 15:47:08.557007 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:08 crc kubenswrapper[4764]: I0320 15:47:08.605098 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:47:10 crc kubenswrapper[4764]: I0320 15:47:10.523368 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wbkvz" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="registry-server" containerID="cri-o://06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245" gracePeriod=2 Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.149337 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.321967 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8sxf\" (UniqueName: \"kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf\") pod \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.322188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities\") pod \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.322296 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content\") pod \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\" (UID: \"df5886f9-ebdb-43aa-a826-6dd12d1ac400\") " Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.323775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities" (OuterVolumeSpecName: "utilities") pod "df5886f9-ebdb-43aa-a826-6dd12d1ac400" (UID: 
"df5886f9-ebdb-43aa-a826-6dd12d1ac400"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.329316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf" (OuterVolumeSpecName: "kube-api-access-c8sxf") pod "df5886f9-ebdb-43aa-a826-6dd12d1ac400" (UID: "df5886f9-ebdb-43aa-a826-6dd12d1ac400"). InnerVolumeSpecName "kube-api-access-c8sxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.371101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5886f9-ebdb-43aa-a826-6dd12d1ac400" (UID: "df5886f9-ebdb-43aa-a826-6dd12d1ac400"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.425355 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.425400 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5886f9-ebdb-43aa-a826-6dd12d1ac400-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.425414 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8sxf\" (UniqueName: \"kubernetes.io/projected/df5886f9-ebdb-43aa-a826-6dd12d1ac400-kube-api-access-c8sxf\") on node \"crc\" DevicePath \"\"" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.534639 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerID="06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245" exitCode=0 Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.534689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerDied","Data":"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245"} Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.534719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wbkvz" event={"ID":"df5886f9-ebdb-43aa-a826-6dd12d1ac400","Type":"ContainerDied","Data":"19272a7dd03318d57b17dc2ca9178edff3e11e4c0dc22358bdcd9d04da53dac9"} Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.534739 4764 scope.go:117] "RemoveContainer" containerID="06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.534757 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wbkvz" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.565829 4764 scope.go:117] "RemoveContainer" containerID="6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.579846 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.587975 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wbkvz"] Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.594912 4764 scope.go:117] "RemoveContainer" containerID="a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.635301 4764 scope.go:117] "RemoveContainer" containerID="06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245" Mar 20 15:47:11 crc kubenswrapper[4764]: E0320 15:47:11.635839 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245\": container with ID starting with 06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245 not found: ID does not exist" containerID="06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.635875 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245"} err="failed to get container status \"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245\": rpc error: code = NotFound desc = could not find container \"06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245\": container with ID starting with 06b1dca3fdc807da67667bb7acd4b8b35e0aa20d459a0fe6060bd968394e7245 not 
found: ID does not exist" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.635894 4764 scope.go:117] "RemoveContainer" containerID="6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f" Mar 20 15:47:11 crc kubenswrapper[4764]: E0320 15:47:11.636283 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f\": container with ID starting with 6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f not found: ID does not exist" containerID="6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.636324 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f"} err="failed to get container status \"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f\": rpc error: code = NotFound desc = could not find container \"6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f\": container with ID starting with 6050793e9c7f54155770acb51a543372474f35b8f9fce7147d756508463a160f not found: ID does not exist" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.636598 4764 scope.go:117] "RemoveContainer" containerID="a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381" Mar 20 15:47:11 crc kubenswrapper[4764]: E0320 15:47:11.637162 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381\": container with ID starting with a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381 not found: ID does not exist" containerID="a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381" Mar 20 15:47:11 crc kubenswrapper[4764]: I0320 15:47:11.637212 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381"} err="failed to get container status \"a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381\": rpc error: code = NotFound desc = could not find container \"a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381\": container with ID starting with a0e54de82ca36bae6fbc1cf0141d5375d84e43885fb22fef62a82cbdb0e43381 not found: ID does not exist" Mar 20 15:47:13 crc kubenswrapper[4764]: I0320 15:47:13.136741 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" path="/var/lib/kubelet/pods/df5886f9-ebdb-43aa-a826-6dd12d1ac400/volumes" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.625314 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"] Mar 20 15:47:24 crc kubenswrapper[4764]: E0320 15:47:24.627448 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="registry-server" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.627485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="registry-server" Mar 20 15:47:24 crc kubenswrapper[4764]: E0320 15:47:24.627528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="extract-utilities" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.627536 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="extract-utilities" Mar 20 15:47:24 crc kubenswrapper[4764]: E0320 15:47:24.627556 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="extract-content" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 
15:47:24.627563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="extract-content" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.627868 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5886f9-ebdb-43aa-a826-6dd12d1ac400" containerName="registry-server" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.629606 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.646778 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"] Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.771766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.771851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgqc\" (UniqueName: \"kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.772807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: 
I0320 15:47:24.874061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.874311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.874468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgqc\" (UniqueName: \"kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.874590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.874828 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.900125 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgqc\" (UniqueName: \"kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc\") pod \"redhat-marketplace-nxdm4\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:24 crc kubenswrapper[4764]: I0320 15:47:24.959592 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:25 crc kubenswrapper[4764]: I0320 15:47:25.441540 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"] Mar 20 15:47:25 crc kubenswrapper[4764]: I0320 15:47:25.670518 4764 generic.go:334] "Generic (PLEG): container finished" podID="30ed405b-ba46-449d-a913-c44a86a7d418" containerID="4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492" exitCode=0 Mar 20 15:47:25 crc kubenswrapper[4764]: I0320 15:47:25.670617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerDied","Data":"4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492"} Mar 20 15:47:25 crc kubenswrapper[4764]: I0320 15:47:25.670767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerStarted","Data":"6a7f2567c793faf2042db9ef339675e5b63848b6ad68d697909880471f1b5402"} Mar 20 15:47:26 crc kubenswrapper[4764]: I0320 15:47:26.681311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerStarted","Data":"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"} Mar 20 15:47:27 crc kubenswrapper[4764]: I0320 15:47:27.693308 4764 
generic.go:334] "Generic (PLEG): container finished" podID="30ed405b-ba46-449d-a913-c44a86a7d418" containerID="0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342" exitCode=0 Mar 20 15:47:27 crc kubenswrapper[4764]: I0320 15:47:27.693368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerDied","Data":"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"} Mar 20 15:47:28 crc kubenswrapper[4764]: I0320 15:47:28.704404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerStarted","Data":"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"} Mar 20 15:47:28 crc kubenswrapper[4764]: I0320 15:47:28.726734 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nxdm4" podStartSLOduration=2.264485135 podStartE2EDuration="4.726720513s" podCreationTimestamp="2026-03-20 15:47:24 +0000 UTC" firstStartedPulling="2026-03-20 15:47:25.672517078 +0000 UTC m=+3367.288706207" lastFinishedPulling="2026-03-20 15:47:28.134752456 +0000 UTC m=+3369.750941585" observedRunningTime="2026-03-20 15:47:28.719545861 +0000 UTC m=+3370.335734990" watchObservedRunningTime="2026-03-20 15:47:28.726720513 +0000 UTC m=+3370.342909642" Mar 20 15:47:34 crc kubenswrapper[4764]: I0320 15:47:34.960815 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:34 crc kubenswrapper[4764]: I0320 15:47:34.961301 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:35 crc kubenswrapper[4764]: I0320 15:47:35.024082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:35 crc kubenswrapper[4764]: I0320 15:47:35.805251 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:35 crc kubenswrapper[4764]: I0320 15:47:35.861685 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"] Mar 20 15:47:37 crc kubenswrapper[4764]: I0320 15:47:37.774045 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nxdm4" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="registry-server" containerID="cri-o://6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6" gracePeriod=2 Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.370682 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxdm4" Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.553683 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmgqc\" (UniqueName: \"kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc\") pod \"30ed405b-ba46-449d-a913-c44a86a7d418\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.554098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities\") pod \"30ed405b-ba46-449d-a913-c44a86a7d418\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") " Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.554147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content\") pod 
\"30ed405b-ba46-449d-a913-c44a86a7d418\" (UID: \"30ed405b-ba46-449d-a913-c44a86a7d418\") "
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.554806 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities" (OuterVolumeSpecName: "utilities") pod "30ed405b-ba46-449d-a913-c44a86a7d418" (UID: "30ed405b-ba46-449d-a913-c44a86a7d418"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.568928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc" (OuterVolumeSpecName: "kube-api-access-jmgqc") pod "30ed405b-ba46-449d-a913-c44a86a7d418" (UID: "30ed405b-ba46-449d-a913-c44a86a7d418"). InnerVolumeSpecName "kube-api-access-jmgqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.580490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30ed405b-ba46-449d-a913-c44a86a7d418" (UID: "30ed405b-ba46-449d-a913-c44a86a7d418"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.656584 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.656620 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30ed405b-ba46-449d-a913-c44a86a7d418-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.656667 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmgqc\" (UniqueName: \"kubernetes.io/projected/30ed405b-ba46-449d-a913-c44a86a7d418-kube-api-access-jmgqc\") on node \"crc\" DevicePath \"\""
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.785576 4764 generic.go:334] "Generic (PLEG): container finished" podID="30ed405b-ba46-449d-a913-c44a86a7d418" containerID="6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6" exitCode=0
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.785619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerDied","Data":"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"}
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.785644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nxdm4" event={"ID":"30ed405b-ba46-449d-a913-c44a86a7d418","Type":"ContainerDied","Data":"6a7f2567c793faf2042db9ef339675e5b63848b6ad68d697909880471f1b5402"}
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.785661 4764 scope.go:117] "RemoveContainer" containerID="6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.785787 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nxdm4"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.806949 4764 scope.go:117] "RemoveContainer" containerID="0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.827420 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"]
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.838677 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nxdm4"]
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.844872 4764 scope.go:117] "RemoveContainer" containerID="4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.871310 4764 scope.go:117] "RemoveContainer" containerID="6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"
Mar 20 15:47:38 crc kubenswrapper[4764]: E0320 15:47:38.871776 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6\": container with ID starting with 6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6 not found: ID does not exist" containerID="6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.871892 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6"} err="failed to get container status \"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6\": rpc error: code = NotFound desc = could not find container \"6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6\": container with ID starting with 6b0075284d5fd1db6651b1d212ccfbb7610e11eca4df89f0cb0546f50b0395b6 not found: ID does not exist"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.871978 4764 scope.go:117] "RemoveContainer" containerID="0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"
Mar 20 15:47:38 crc kubenswrapper[4764]: E0320 15:47:38.872357 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342\": container with ID starting with 0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342 not found: ID does not exist" containerID="0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.872397 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342"} err="failed to get container status \"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342\": rpc error: code = NotFound desc = could not find container \"0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342\": container with ID starting with 0f97e3544d6e1a235a3eff6756bf71261bb202d4f3c168f462947906c5686342 not found: ID does not exist"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.872420 4764 scope.go:117] "RemoveContainer" containerID="4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492"
Mar 20 15:47:38 crc kubenswrapper[4764]: E0320 15:47:38.872704 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492\": container with ID starting with 4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492 not found: ID does not exist" containerID="4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492"
Mar 20 15:47:38 crc kubenswrapper[4764]: I0320 15:47:38.872724 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492"} err="failed to get container status \"4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492\": rpc error: code = NotFound desc = could not find container \"4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492\": container with ID starting with 4823eeac105396686d1995effd2b2aca78342cdfdbfaf3ea580cce5fc71e6492 not found: ID does not exist"
Mar 20 15:47:39 crc kubenswrapper[4764]: I0320 15:47:39.136535 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" path="/var/lib/kubelet/pods/30ed405b-ba46-449d-a913-c44a86a7d418/volumes"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.144848 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567028-v4vr7"]
Mar 20 15:48:00 crc kubenswrapper[4764]: E0320 15:48:00.145713 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="registry-server"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.145727 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="registry-server"
Mar 20 15:48:00 crc kubenswrapper[4764]: E0320 15:48:00.145747 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="extract-utilities"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.145755 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="extract-utilities"
Mar 20 15:48:00 crc kubenswrapper[4764]: E0320 15:48:00.145781 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="extract-content"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.145789 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="extract-content"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.149226 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ed405b-ba46-449d-a913-c44a86a7d418" containerName="registry-server"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.159791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.183044 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.183296 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.183492 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.214020 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk9m6\" (UniqueName: \"kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6\") pod \"auto-csr-approver-29567028-v4vr7\" (UID: \"7754fa43-1f4e-413a-be02-fe767c0eeb75\") " pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.232714 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-v4vr7"]
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.315810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk9m6\" (UniqueName: \"kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6\") pod \"auto-csr-approver-29567028-v4vr7\" (UID: \"7754fa43-1f4e-413a-be02-fe767c0eeb75\") " pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.344934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk9m6\" (UniqueName: \"kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6\") pod \"auto-csr-approver-29567028-v4vr7\" (UID: \"7754fa43-1f4e-413a-be02-fe767c0eeb75\") " pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:00 crc kubenswrapper[4764]: I0320 15:48:00.540509 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:01 crc kubenswrapper[4764]: I0320 15:48:01.002422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-v4vr7"]
Mar 20 15:48:01 crc kubenswrapper[4764]: I0320 15:48:01.003764 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 15:48:02 crc kubenswrapper[4764]: I0320 15:48:02.002117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-v4vr7" event={"ID":"7754fa43-1f4e-413a-be02-fe767c0eeb75","Type":"ContainerStarted","Data":"a010a100564b1171cab200998a0e6daf1d3cf1783a8204ecc24589263e7eb893"}
Mar 20 15:48:03 crc kubenswrapper[4764]: I0320 15:48:03.012317 4764 generic.go:334] "Generic (PLEG): container finished" podID="7754fa43-1f4e-413a-be02-fe767c0eeb75" containerID="3e4d0dbb012b759ceacb498f1497e71ce18858a4f32749b28452de2cfc5d7c57" exitCode=0
Mar 20 15:48:03 crc kubenswrapper[4764]: I0320 15:48:03.012450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-v4vr7" event={"ID":"7754fa43-1f4e-413a-be02-fe767c0eeb75","Type":"ContainerDied","Data":"3e4d0dbb012b759ceacb498f1497e71ce18858a4f32749b28452de2cfc5d7c57"}
Mar 20 15:48:04 crc kubenswrapper[4764]: I0320 15:48:04.557919 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:04 crc kubenswrapper[4764]: I0320 15:48:04.585277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk9m6\" (UniqueName: \"kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6\") pod \"7754fa43-1f4e-413a-be02-fe767c0eeb75\" (UID: \"7754fa43-1f4e-413a-be02-fe767c0eeb75\") "
Mar 20 15:48:04 crc kubenswrapper[4764]: I0320 15:48:04.606647 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6" (OuterVolumeSpecName: "kube-api-access-bk9m6") pod "7754fa43-1f4e-413a-be02-fe767c0eeb75" (UID: "7754fa43-1f4e-413a-be02-fe767c0eeb75"). InnerVolumeSpecName "kube-api-access-bk9m6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:48:04 crc kubenswrapper[4764]: I0320 15:48:04.687890 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk9m6\" (UniqueName: \"kubernetes.io/projected/7754fa43-1f4e-413a-be02-fe767c0eeb75-kube-api-access-bk9m6\") on node \"crc\" DevicePath \"\""
Mar 20 15:48:05 crc kubenswrapper[4764]: I0320 15:48:05.046343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567028-v4vr7" event={"ID":"7754fa43-1f4e-413a-be02-fe767c0eeb75","Type":"ContainerDied","Data":"a010a100564b1171cab200998a0e6daf1d3cf1783a8204ecc24589263e7eb893"}
Mar 20 15:48:05 crc kubenswrapper[4764]: I0320 15:48:05.046441 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a010a100564b1171cab200998a0e6daf1d3cf1783a8204ecc24589263e7eb893"
Mar 20 15:48:05 crc kubenswrapper[4764]: I0320 15:48:05.046366 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567028-v4vr7"
Mar 20 15:48:05 crc kubenswrapper[4764]: I0320 15:48:05.623233 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-9b9mf"]
Mar 20 15:48:05 crc kubenswrapper[4764]: I0320 15:48:05.631048 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567022-9b9mf"]
Mar 20 15:48:07 crc kubenswrapper[4764]: I0320 15:48:07.146709 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f6364c-5c64-4314-81c2-3477d10a069a" path="/var/lib/kubelet/pods/91f6364c-5c64-4314-81c2-3477d10a069a/volumes"
Mar 20 15:48:08 crc kubenswrapper[4764]: I0320 15:48:08.443205 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:48:08 crc kubenswrapper[4764]: I0320 15:48:08.443269 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:48:11 crc kubenswrapper[4764]: I0320 15:48:11.031355 4764 scope.go:117] "RemoveContainer" containerID="d971b1c7671a18c61e2a60af840d12c1b5e75d283f3bf012087d89e76e319f85"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.141041 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:32 crc kubenswrapper[4764]: E0320 15:48:32.141830 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7754fa43-1f4e-413a-be02-fe767c0eeb75" containerName="oc"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.141904 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7754fa43-1f4e-413a-be02-fe767c0eeb75" containerName="oc"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.142102 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7754fa43-1f4e-413a-be02-fe767c0eeb75" containerName="oc"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.143332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.186842 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.258843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.259109 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49zc\" (UniqueName: \"kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.259426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.361139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.361829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.362188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49zc\" (UniqueName: \"kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.362106 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.361744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.395107 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49zc\" (UniqueName: \"kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc\") pod \"redhat-operators-w5ppw\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") " pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.482827 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:32 crc kubenswrapper[4764]: I0320 15:48:32.939120 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:33 crc kubenswrapper[4764]: I0320 15:48:33.286465 4764 generic.go:334] "Generic (PLEG): container finished" podID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerID="6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02" exitCode=0
Mar 20 15:48:33 crc kubenswrapper[4764]: I0320 15:48:33.286744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerDied","Data":"6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02"}
Mar 20 15:48:33 crc kubenswrapper[4764]: I0320 15:48:33.286827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerStarted","Data":"5ebcbcf92bfb4ffce33464260d6b0bbb918b30b715596d7b39bb72932cdbb5bf"}
Mar 20 15:48:35 crc kubenswrapper[4764]: I0320 15:48:35.301757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerStarted","Data":"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"}
Mar 20 15:48:38 crc kubenswrapper[4764]: I0320 15:48:38.443893 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:48:38 crc kubenswrapper[4764]: I0320 15:48:38.444674 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:48:40 crc kubenswrapper[4764]: I0320 15:48:40.351188 4764 generic.go:334] "Generic (PLEG): container finished" podID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerID="408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb" exitCode=0
Mar 20 15:48:40 crc kubenswrapper[4764]: I0320 15:48:40.351243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerDied","Data":"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"}
Mar 20 15:48:41 crc kubenswrapper[4764]: I0320 15:48:41.364253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerStarted","Data":"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"}
Mar 20 15:48:41 crc kubenswrapper[4764]: I0320 15:48:41.385406 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5ppw" podStartSLOduration=1.853144227 podStartE2EDuration="9.38539007s" podCreationTimestamp="2026-03-20 15:48:32 +0000 UTC" firstStartedPulling="2026-03-20 15:48:33.288178725 +0000 UTC m=+3434.904367844" lastFinishedPulling="2026-03-20 15:48:40.820424558 +0000 UTC m=+3442.436613687" observedRunningTime="2026-03-20 15:48:41.379062604 +0000 UTC m=+3442.995251733" watchObservedRunningTime="2026-03-20 15:48:41.38539007 +0000 UTC m=+3443.001579189"
Mar 20 15:48:42 crc kubenswrapper[4764]: I0320 15:48:42.483118 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:42 crc kubenswrapper[4764]: I0320 15:48:42.484083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:43 crc kubenswrapper[4764]: I0320 15:48:43.552672 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5ppw" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="registry-server" probeResult="failure" output=<
Mar 20 15:48:43 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Mar 20 15:48:43 crc kubenswrapper[4764]: >
Mar 20 15:48:52 crc kubenswrapper[4764]: I0320 15:48:52.556342 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:52 crc kubenswrapper[4764]: I0320 15:48:52.616671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:52 crc kubenswrapper[4764]: I0320 15:48:52.801528 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:54 crc kubenswrapper[4764]: I0320 15:48:54.486083 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5ppw" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="registry-server" containerID="cri-o://1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1" gracePeriod=2
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.249233 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.327133 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities\") pod \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") "
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.327190 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content\") pod \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") "
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.327225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p49zc\" (UniqueName: \"kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc\") pod \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\" (UID: \"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8\") "
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.329052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities" (OuterVolumeSpecName: "utilities") pod "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" (UID: "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.333568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc" (OuterVolumeSpecName: "kube-api-access-p49zc") pod "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" (UID: "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8"). InnerVolumeSpecName "kube-api-access-p49zc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.429555 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.429598 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p49zc\" (UniqueName: \"kubernetes.io/projected/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-kube-api-access-p49zc\") on node \"crc\" DevicePath \"\""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.450615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" (UID: "fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.495862 4764 generic.go:334] "Generic (PLEG): container finished" podID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerID="1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1" exitCode=0
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.495946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerDied","Data":"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"}
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.495970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5ppw" event={"ID":"fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8","Type":"ContainerDied","Data":"5ebcbcf92bfb4ffce33464260d6b0bbb918b30b715596d7b39bb72932cdbb5bf"}
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.495988 4764 scope.go:117] "RemoveContainer" containerID="1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.496294 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5ppw"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.518580 4764 scope.go:117] "RemoveContainer" containerID="408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.530866 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.554415 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.561118 4764 scope.go:117] "RemoveContainer" containerID="6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.564412 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5ppw"]
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.624968 4764 scope.go:117] "RemoveContainer" containerID="1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"
Mar 20 15:48:55 crc kubenswrapper[4764]: E0320 15:48:55.625518 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1\": container with ID starting with 1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1 not found: ID does not exist" containerID="1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.625568 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1"} err="failed to get container status \"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1\": rpc error: code = NotFound desc = could not find container \"1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1\": container with ID starting with 1120383e8c2c3d483761c47466a1da9f9c4af3cc4b3d5ab1f58d0fa4928d0dc1 not found: ID does not exist"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.625599 4764 scope.go:117] "RemoveContainer" containerID="408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"
Mar 20 15:48:55 crc kubenswrapper[4764]: E0320 15:48:55.626052 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb\": container with ID starting with 408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb not found: ID does not exist" containerID="408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.626087 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb"} err="failed to get container status \"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb\": rpc error: code = NotFound desc = could not find container \"408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb\": container with ID starting with 408b78d776fcdaca696173b5f6d5c6274b69bd93c220fa796cc2f662b6c19dcb not found: ID does not exist"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.626107 4764 scope.go:117] "RemoveContainer" containerID="6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02"
Mar 20 15:48:55 crc kubenswrapper[4764]: E0320 15:48:55.626520 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02\": container with ID starting with 6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02 not found: ID does not exist" containerID="6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02"
Mar 20 15:48:55 crc kubenswrapper[4764]: I0320 15:48:55.626557 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02"} err="failed to get container status \"6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02\": rpc error: code = NotFound desc = could not find container \"6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02\": container with ID starting with 6c258aef5f9855d65958162afb331a376fa1a9425bf175f58de91638ed8b1f02 not found: ID does not exist"
Mar 20 15:48:57 crc kubenswrapper[4764]: I0320 15:48:57.146190 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" path="/var/lib/kubelet/pods/fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8/volumes"
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.443299 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.444175 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.444243 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5"
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.445537 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.445631 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63" gracePeriod=600
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.612786 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63" exitCode=0
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.612887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63"}
Mar 20 15:49:08 crc kubenswrapper[4764]: I0320 15:49:08.614426 4764 scope.go:117] "RemoveContainer" containerID="d744b1e9ce4d40f973ea0b35576548e0d3d66f35d4f3ec070aef565d09aca3f3"
Mar 20 15:49:09 crc kubenswrapper[4764]: I0320 15:49:09.622862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5"
event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"} Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.144253 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567030-5xjss"] Mar 20 15:50:00 crc kubenswrapper[4764]: E0320 15:50:00.145057 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="registry-server" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.145069 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="registry-server" Mar 20 15:50:00 crc kubenswrapper[4764]: E0320 15:50:00.145081 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="extract-utilities" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.145088 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="extract-utilities" Mar 20 15:50:00 crc kubenswrapper[4764]: E0320 15:50:00.145116 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="extract-content" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.145122 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="extract-content" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.145299 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb4ecd6-9d45-4ff3-b9f8-d733c98c8fa8" containerName="registry-server" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.145874 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.148106 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.149147 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.149155 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.165808 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-5xjss"] Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.297445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbnf\" (UniqueName: \"kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf\") pod \"auto-csr-approver-29567030-5xjss\" (UID: \"550adf3b-64b2-49b3-8e67-90eb76630aec\") " pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.399884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbnf\" (UniqueName: \"kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf\") pod \"auto-csr-approver-29567030-5xjss\" (UID: \"550adf3b-64b2-49b3-8e67-90eb76630aec\") " pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.433348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbnf\" (UniqueName: \"kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf\") pod \"auto-csr-approver-29567030-5xjss\" (UID: \"550adf3b-64b2-49b3-8e67-90eb76630aec\") " 
pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.475423 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:00 crc kubenswrapper[4764]: I0320 15:50:00.915954 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-5xjss"] Mar 20 15:50:01 crc kubenswrapper[4764]: I0320 15:50:01.145921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-5xjss" event={"ID":"550adf3b-64b2-49b3-8e67-90eb76630aec","Type":"ContainerStarted","Data":"4eadb66da92f7400c59aa18196df314522dbed1c1dc2f4e6361fb68a166b4df2"} Mar 20 15:50:03 crc kubenswrapper[4764]: I0320 15:50:03.157011 4764 generic.go:334] "Generic (PLEG): container finished" podID="550adf3b-64b2-49b3-8e67-90eb76630aec" containerID="149134fe9fb329177fa80f53e202278d3eacf06969c5c96deec9799d601dc542" exitCode=0 Mar 20 15:50:03 crc kubenswrapper[4764]: I0320 15:50:03.157106 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-5xjss" event={"ID":"550adf3b-64b2-49b3-8e67-90eb76630aec","Type":"ContainerDied","Data":"149134fe9fb329177fa80f53e202278d3eacf06969c5c96deec9799d601dc542"} Mar 20 15:50:04 crc kubenswrapper[4764]: I0320 15:50:04.706837 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:04 crc kubenswrapper[4764]: I0320 15:50:04.885932 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkbnf\" (UniqueName: \"kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf\") pod \"550adf3b-64b2-49b3-8e67-90eb76630aec\" (UID: \"550adf3b-64b2-49b3-8e67-90eb76630aec\") " Mar 20 15:50:04 crc kubenswrapper[4764]: I0320 15:50:04.894842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf" (OuterVolumeSpecName: "kube-api-access-dkbnf") pod "550adf3b-64b2-49b3-8e67-90eb76630aec" (UID: "550adf3b-64b2-49b3-8e67-90eb76630aec"). InnerVolumeSpecName "kube-api-access-dkbnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:50:04 crc kubenswrapper[4764]: I0320 15:50:04.989005 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkbnf\" (UniqueName: \"kubernetes.io/projected/550adf3b-64b2-49b3-8e67-90eb76630aec-kube-api-access-dkbnf\") on node \"crc\" DevicePath \"\"" Mar 20 15:50:05 crc kubenswrapper[4764]: I0320 15:50:05.190914 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567030-5xjss" event={"ID":"550adf3b-64b2-49b3-8e67-90eb76630aec","Type":"ContainerDied","Data":"4eadb66da92f7400c59aa18196df314522dbed1c1dc2f4e6361fb68a166b4df2"} Mar 20 15:50:05 crc kubenswrapper[4764]: I0320 15:50:05.190950 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eadb66da92f7400c59aa18196df314522dbed1c1dc2f4e6361fb68a166b4df2" Mar 20 15:50:05 crc kubenswrapper[4764]: I0320 15:50:05.191006 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567030-5xjss" Mar 20 15:50:05 crc kubenswrapper[4764]: I0320 15:50:05.783358 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-pdf5w"] Mar 20 15:50:05 crc kubenswrapper[4764]: I0320 15:50:05.790541 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567024-pdf5w"] Mar 20 15:50:07 crc kubenswrapper[4764]: I0320 15:50:07.138096 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a" path="/var/lib/kubelet/pods/0006f8ef-cf16-4749-aafb-b2c7ebbfdd4a/volumes" Mar 20 15:50:11 crc kubenswrapper[4764]: I0320 15:50:11.168725 4764 scope.go:117] "RemoveContainer" containerID="d3fde9228a7d2b3589b8c5ec24a4cb33697f8eda68053f8a2c8f2993ca918522" Mar 20 15:51:08 crc kubenswrapper[4764]: I0320 15:51:08.443640 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:51:08 crc kubenswrapper[4764]: I0320 15:51:08.444322 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:51:38 crc kubenswrapper[4764]: I0320 15:51:38.443745 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:51:38 crc kubenswrapper[4764]: 
I0320 15:51:38.444669 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.144150 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567032-lvfpn"] Mar 20 15:52:00 crc kubenswrapper[4764]: E0320 15:52:00.145249 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550adf3b-64b2-49b3-8e67-90eb76630aec" containerName="oc" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.145264 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="550adf3b-64b2-49b3-8e67-90eb76630aec" containerName="oc" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.145530 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="550adf3b-64b2-49b3-8e67-90eb76630aec" containerName="oc" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.146328 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.148748 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.153546 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.153724 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.158337 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-lvfpn"] Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.201035 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rwf\" (UniqueName: \"kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf\") pod \"auto-csr-approver-29567032-lvfpn\" (UID: \"2399033e-a5ae-4d16-88bb-c5b019f65bde\") " pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.304542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rwf\" (UniqueName: \"kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf\") pod \"auto-csr-approver-29567032-lvfpn\" (UID: \"2399033e-a5ae-4d16-88bb-c5b019f65bde\") " pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.328575 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rwf\" (UniqueName: \"kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf\") pod \"auto-csr-approver-29567032-lvfpn\" (UID: \"2399033e-a5ae-4d16-88bb-c5b019f65bde\") " 
pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.493761 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:00 crc kubenswrapper[4764]: I0320 15:52:00.998254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-lvfpn"] Mar 20 15:52:01 crc kubenswrapper[4764]: I0320 15:52:01.318658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" event={"ID":"2399033e-a5ae-4d16-88bb-c5b019f65bde","Type":"ContainerStarted","Data":"f3bcd2f74a53414c1cc6ee498e60cca61856bba83f3aa9e85b965ce7daff08a3"} Mar 20 15:52:02 crc kubenswrapper[4764]: I0320 15:52:02.327259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" event={"ID":"2399033e-a5ae-4d16-88bb-c5b019f65bde","Type":"ContainerStarted","Data":"00b6d62499524adfd202ae445e63f8cfbb61968064af9759362b2f7c61c6fd79"} Mar 20 15:52:02 crc kubenswrapper[4764]: I0320 15:52:02.356565 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" podStartSLOduration=1.4682243879999999 podStartE2EDuration="2.356541359s" podCreationTimestamp="2026-03-20 15:52:00 +0000 UTC" firstStartedPulling="2026-03-20 15:52:01.003587985 +0000 UTC m=+3642.619777114" lastFinishedPulling="2026-03-20 15:52:01.891904946 +0000 UTC m=+3643.508094085" observedRunningTime="2026-03-20 15:52:02.345519598 +0000 UTC m=+3643.961708767" watchObservedRunningTime="2026-03-20 15:52:02.356541359 +0000 UTC m=+3643.972730498" Mar 20 15:52:03 crc kubenswrapper[4764]: I0320 15:52:03.340747 4764 generic.go:334] "Generic (PLEG): container finished" podID="2399033e-a5ae-4d16-88bb-c5b019f65bde" containerID="00b6d62499524adfd202ae445e63f8cfbb61968064af9759362b2f7c61c6fd79" exitCode=0 Mar 20 15:52:03 crc 
kubenswrapper[4764]: I0320 15:52:03.340826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" event={"ID":"2399033e-a5ae-4d16-88bb-c5b019f65bde","Type":"ContainerDied","Data":"00b6d62499524adfd202ae445e63f8cfbb61968064af9759362b2f7c61c6fd79"} Mar 20 15:52:04 crc kubenswrapper[4764]: I0320 15:52:04.760420 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:04 crc kubenswrapper[4764]: I0320 15:52:04.803008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2rwf\" (UniqueName: \"kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf\") pod \"2399033e-a5ae-4d16-88bb-c5b019f65bde\" (UID: \"2399033e-a5ae-4d16-88bb-c5b019f65bde\") " Mar 20 15:52:04 crc kubenswrapper[4764]: I0320 15:52:04.810302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf" (OuterVolumeSpecName: "kube-api-access-w2rwf") pod "2399033e-a5ae-4d16-88bb-c5b019f65bde" (UID: "2399033e-a5ae-4d16-88bb-c5b019f65bde"). InnerVolumeSpecName "kube-api-access-w2rwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:52:04 crc kubenswrapper[4764]: I0320 15:52:04.905105 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2rwf\" (UniqueName: \"kubernetes.io/projected/2399033e-a5ae-4d16-88bb-c5b019f65bde-kube-api-access-w2rwf\") on node \"crc\" DevicePath \"\"" Mar 20 15:52:05 crc kubenswrapper[4764]: I0320 15:52:05.360054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" event={"ID":"2399033e-a5ae-4d16-88bb-c5b019f65bde","Type":"ContainerDied","Data":"f3bcd2f74a53414c1cc6ee498e60cca61856bba83f3aa9e85b965ce7daff08a3"} Mar 20 15:52:05 crc kubenswrapper[4764]: I0320 15:52:05.360351 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3bcd2f74a53414c1cc6ee498e60cca61856bba83f3aa9e85b965ce7daff08a3" Mar 20 15:52:05 crc kubenswrapper[4764]: I0320 15:52:05.360113 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567032-lvfpn" Mar 20 15:52:05 crc kubenswrapper[4764]: I0320 15:52:05.409403 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-qvcm6"] Mar 20 15:52:05 crc kubenswrapper[4764]: I0320 15:52:05.418123 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567026-qvcm6"] Mar 20 15:52:07 crc kubenswrapper[4764]: I0320 15:52:07.141455 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e" path="/var/lib/kubelet/pods/0765a61d-7bd5-4e5d-bf14-d359fc3c7a5e/volumes" Mar 20 15:52:08 crc kubenswrapper[4764]: I0320 15:52:08.443730 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 15:52:08 crc kubenswrapper[4764]: I0320 15:52:08.444761 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:52:08 crc kubenswrapper[4764]: I0320 15:52:08.444904 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 15:52:08 crc kubenswrapper[4764]: I0320 15:52:08.445876 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 15:52:08 crc kubenswrapper[4764]: I0320 15:52:08.446027 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" gracePeriod=600 Mar 20 15:52:08 crc kubenswrapper[4764]: E0320 15:52:08.569915 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:52:09 crc kubenswrapper[4764]: 
I0320 15:52:09.402676 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" exitCode=0 Mar 20 15:52:09 crc kubenswrapper[4764]: I0320 15:52:09.402755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"} Mar 20 15:52:09 crc kubenswrapper[4764]: I0320 15:52:09.403195 4764 scope.go:117] "RemoveContainer" containerID="10384472b77ab8f5ced352ce2637ccaa4865d4dbf17c39394007213676c97e63" Mar 20 15:52:09 crc kubenswrapper[4764]: I0320 15:52:09.404329 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:52:09 crc kubenswrapper[4764]: E0320 15:52:09.404790 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:52:11 crc kubenswrapper[4764]: I0320 15:52:11.258510 4764 scope.go:117] "RemoveContainer" containerID="e05561671954527ad8408ec61f1ddd51537d0325b5c28512d8a11073a03c118d" Mar 20 15:52:22 crc kubenswrapper[4764]: I0320 15:52:22.126615 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:52:22 crc kubenswrapper[4764]: E0320 15:52:22.127296 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:52:35 crc kubenswrapper[4764]: I0320 15:52:35.126401 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:52:35 crc kubenswrapper[4764]: E0320 15:52:35.127017 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:52:46 crc kubenswrapper[4764]: I0320 15:52:46.128063 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:52:46 crc kubenswrapper[4764]: E0320 15:52:46.128646 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:53:01 crc kubenswrapper[4764]: I0320 15:53:01.130255 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:53:01 crc kubenswrapper[4764]: E0320 15:53:01.130961 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.551972 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:11 crc kubenswrapper[4764]: E0320 15:53:11.552876 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2399033e-a5ae-4d16-88bb-c5b019f65bde" containerName="oc" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.552895 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2399033e-a5ae-4d16-88bb-c5b019f65bde" containerName="oc" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.553127 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2399033e-a5ae-4d16-88bb-c5b019f65bde" containerName="oc" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.554859 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.568680 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.685610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.685684 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.686198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zk6n\" (UniqueName: \"kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.788516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zk6n\" (UniqueName: \"kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.788651 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.788690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.789402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.789464 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.810606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zk6n\" (UniqueName: \"kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n\") pod \"certified-operators-87qg6\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:11 crc kubenswrapper[4764]: I0320 15:53:11.874549 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:12 crc kubenswrapper[4764]: I0320 15:53:12.126994 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:53:12 crc kubenswrapper[4764]: E0320 15:53:12.127593 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:53:12 crc kubenswrapper[4764]: I0320 15:53:12.392618 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:13 crc kubenswrapper[4764]: I0320 15:53:13.082061 4764 generic.go:334] "Generic (PLEG): container finished" podID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerID="efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c" exitCode=0 Mar 20 15:53:13 crc kubenswrapper[4764]: I0320 15:53:13.082150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerDied","Data":"efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c"} Mar 20 15:53:13 crc kubenswrapper[4764]: I0320 15:53:13.082427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerStarted","Data":"ef0d92c2f77a816456c2639c3e1a7edb272905c7c9ce5cbd778c6d9880e9607e"} Mar 20 15:53:13 crc kubenswrapper[4764]: I0320 15:53:13.084951 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 
15:53:14 crc kubenswrapper[4764]: I0320 15:53:14.094564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerStarted","Data":"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd"} Mar 20 15:53:16 crc kubenswrapper[4764]: I0320 15:53:16.119490 4764 generic.go:334] "Generic (PLEG): container finished" podID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerID="0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd" exitCode=0 Mar 20 15:53:16 crc kubenswrapper[4764]: I0320 15:53:16.119612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerDied","Data":"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd"} Mar 20 15:53:17 crc kubenswrapper[4764]: I0320 15:53:17.141104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerStarted","Data":"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa"} Mar 20 15:53:17 crc kubenswrapper[4764]: I0320 15:53:17.165702 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-87qg6" podStartSLOduration=2.6837224969999998 podStartE2EDuration="6.165683576s" podCreationTimestamp="2026-03-20 15:53:11 +0000 UTC" firstStartedPulling="2026-03-20 15:53:13.084624791 +0000 UTC m=+3714.700813930" lastFinishedPulling="2026-03-20 15:53:16.56658581 +0000 UTC m=+3718.182775009" observedRunningTime="2026-03-20 15:53:17.161236629 +0000 UTC m=+3718.777425788" watchObservedRunningTime="2026-03-20 15:53:17.165683576 +0000 UTC m=+3718.781872715" Mar 20 15:53:21 crc kubenswrapper[4764]: I0320 15:53:21.874643 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:21 crc kubenswrapper[4764]: I0320 15:53:21.875070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:21 crc kubenswrapper[4764]: I0320 15:53:21.939070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:22 crc kubenswrapper[4764]: I0320 15:53:22.250883 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:22 crc kubenswrapper[4764]: I0320 15:53:22.307683 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.194856 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-87qg6" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="registry-server" containerID="cri-o://d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa" gracePeriod=2 Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.783304 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.983639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zk6n\" (UniqueName: \"kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n\") pod \"21b7a677-a6ec-4e79-8145-88a5b08158ee\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.983744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content\") pod \"21b7a677-a6ec-4e79-8145-88a5b08158ee\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.983998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities\") pod \"21b7a677-a6ec-4e79-8145-88a5b08158ee\" (UID: \"21b7a677-a6ec-4e79-8145-88a5b08158ee\") " Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.984878 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities" (OuterVolumeSpecName: "utilities") pod "21b7a677-a6ec-4e79-8145-88a5b08158ee" (UID: "21b7a677-a6ec-4e79-8145-88a5b08158ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:53:24 crc kubenswrapper[4764]: I0320 15:53:24.991340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n" (OuterVolumeSpecName: "kube-api-access-5zk6n") pod "21b7a677-a6ec-4e79-8145-88a5b08158ee" (UID: "21b7a677-a6ec-4e79-8145-88a5b08158ee"). InnerVolumeSpecName "kube-api-access-5zk6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.035917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21b7a677-a6ec-4e79-8145-88a5b08158ee" (UID: "21b7a677-a6ec-4e79-8145-88a5b08158ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.086213 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.086261 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zk6n\" (UniqueName: \"kubernetes.io/projected/21b7a677-a6ec-4e79-8145-88a5b08158ee-kube-api-access-5zk6n\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.086272 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21b7a677-a6ec-4e79-8145-88a5b08158ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.207614 4764 generic.go:334] "Generic (PLEG): container finished" podID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerID="d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa" exitCode=0 Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.207662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerDied","Data":"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa"} Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.207773 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-87qg6" event={"ID":"21b7a677-a6ec-4e79-8145-88a5b08158ee","Type":"ContainerDied","Data":"ef0d92c2f77a816456c2639c3e1a7edb272905c7c9ce5cbd778c6d9880e9607e"} Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.207703 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-87qg6" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.207811 4764 scope.go:117] "RemoveContainer" containerID="d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.234350 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.246228 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-87qg6"] Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.257710 4764 scope.go:117] "RemoveContainer" containerID="0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.286509 4764 scope.go:117] "RemoveContainer" containerID="efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.325432 4764 scope.go:117] "RemoveContainer" containerID="d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa" Mar 20 15:53:25 crc kubenswrapper[4764]: E0320 15:53:25.325842 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa\": container with ID starting with d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa not found: ID does not exist" containerID="d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 
15:53:25.325875 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa"} err="failed to get container status \"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa\": rpc error: code = NotFound desc = could not find container \"d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa\": container with ID starting with d966717d145de7d60457e76b8b23657fea08e26aa6d698d9ab7967a3834551aa not found: ID does not exist" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.325896 4764 scope.go:117] "RemoveContainer" containerID="0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd" Mar 20 15:53:25 crc kubenswrapper[4764]: E0320 15:53:25.326209 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd\": container with ID starting with 0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd not found: ID does not exist" containerID="0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.326231 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd"} err="failed to get container status \"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd\": rpc error: code = NotFound desc = could not find container \"0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd\": container with ID starting with 0c5277db9bb9ec4df7899fcf168bdffb3054ddd34c7a431b7b5daed21394a0bd not found: ID does not exist" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.326244 4764 scope.go:117] "RemoveContainer" containerID="efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c" Mar 20 15:53:25 crc 
kubenswrapper[4764]: E0320 15:53:25.326713 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c\": container with ID starting with efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c not found: ID does not exist" containerID="efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c" Mar 20 15:53:25 crc kubenswrapper[4764]: I0320 15:53:25.326735 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c"} err="failed to get container status \"efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c\": rpc error: code = NotFound desc = could not find container \"efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c\": container with ID starting with efdb1c50354bd7290a47b09d50126eb2c93cc916abf31924656b5c9948a3705c not found: ID does not exist" Mar 20 15:53:27 crc kubenswrapper[4764]: I0320 15:53:27.126161 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:53:27 crc kubenswrapper[4764]: E0320 15:53:27.127822 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:53:27 crc kubenswrapper[4764]: I0320 15:53:27.142498 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" path="/var/lib/kubelet/pods/21b7a677-a6ec-4e79-8145-88a5b08158ee/volumes" Mar 20 15:53:39 crc 
kubenswrapper[4764]: I0320 15:53:39.140657 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:53:39 crc kubenswrapper[4764]: E0320 15:53:39.141546 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:53:54 crc kubenswrapper[4764]: I0320 15:53:54.126494 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:53:54 crc kubenswrapper[4764]: E0320 15:53:54.127645 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.161888 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567034-bpg9r"] Mar 20 15:54:00 crc kubenswrapper[4764]: E0320 15:54:00.163216 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="registry-server" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.163243 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="registry-server" Mar 20 15:54:00 crc kubenswrapper[4764]: E0320 15:54:00.163286 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="extract-content" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.163299 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="extract-content" Mar 20 15:54:00 crc kubenswrapper[4764]: E0320 15:54:00.163331 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="extract-utilities" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.163345 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="extract-utilities" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.163717 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b7a677-a6ec-4e79-8145-88a5b08158ee" containerName="registry-server" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.164732 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.167371 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.167445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.169215 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.169270 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-bpg9r"] Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.314315 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlll2\" (UniqueName: \"kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2\") pod \"auto-csr-approver-29567034-bpg9r\" (UID: \"d6464809-48a0-4768-a2d8-4391ccfc04c4\") " pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.416249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlll2\" (UniqueName: \"kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2\") pod \"auto-csr-approver-29567034-bpg9r\" (UID: \"d6464809-48a0-4768-a2d8-4391ccfc04c4\") " pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.441747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlll2\" (UniqueName: \"kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2\") pod \"auto-csr-approver-29567034-bpg9r\" (UID: \"d6464809-48a0-4768-a2d8-4391ccfc04c4\") " 
pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.485710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:00 crc kubenswrapper[4764]: I0320 15:54:00.971134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-bpg9r"] Mar 20 15:54:01 crc kubenswrapper[4764]: I0320 15:54:01.569136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" event={"ID":"d6464809-48a0-4768-a2d8-4391ccfc04c4","Type":"ContainerStarted","Data":"c8a4eb8918c8a1c579001262eac25ad73e199b4f4afc22bd62424e4a3c32df2f"} Mar 20 15:54:02 crc kubenswrapper[4764]: I0320 15:54:02.583220 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" event={"ID":"d6464809-48a0-4768-a2d8-4391ccfc04c4","Type":"ContainerStarted","Data":"9ac16496a500ba03f349d47ad03ba524d07870ddba98e0ef7ac156b0dcaf6b36"} Mar 20 15:54:03 crc kubenswrapper[4764]: I0320 15:54:03.601663 4764 generic.go:334] "Generic (PLEG): container finished" podID="d6464809-48a0-4768-a2d8-4391ccfc04c4" containerID="9ac16496a500ba03f349d47ad03ba524d07870ddba98e0ef7ac156b0dcaf6b36" exitCode=0 Mar 20 15:54:03 crc kubenswrapper[4764]: I0320 15:54:03.601727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" event={"ID":"d6464809-48a0-4768-a2d8-4391ccfc04c4","Type":"ContainerDied","Data":"9ac16496a500ba03f349d47ad03ba524d07870ddba98e0ef7ac156b0dcaf6b36"} Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.101255 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.226075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlll2\" (UniqueName: \"kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2\") pod \"d6464809-48a0-4768-a2d8-4391ccfc04c4\" (UID: \"d6464809-48a0-4768-a2d8-4391ccfc04c4\") " Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.233867 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2" (OuterVolumeSpecName: "kube-api-access-mlll2") pod "d6464809-48a0-4768-a2d8-4391ccfc04c4" (UID: "d6464809-48a0-4768-a2d8-4391ccfc04c4"). InnerVolumeSpecName "kube-api-access-mlll2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.331356 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlll2\" (UniqueName: \"kubernetes.io/projected/d6464809-48a0-4768-a2d8-4391ccfc04c4-kube-api-access-mlll2\") on node \"crc\" DevicePath \"\"" Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.626856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" event={"ID":"d6464809-48a0-4768-a2d8-4391ccfc04c4","Type":"ContainerDied","Data":"c8a4eb8918c8a1c579001262eac25ad73e199b4f4afc22bd62424e4a3c32df2f"} Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.627165 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a4eb8918c8a1c579001262eac25ad73e199b4f4afc22bd62424e4a3c32df2f" Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.626940 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567034-bpg9r" Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.681250 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-v4vr7"] Mar 20 15:54:05 crc kubenswrapper[4764]: I0320 15:54:05.696078 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567028-v4vr7"] Mar 20 15:54:07 crc kubenswrapper[4764]: I0320 15:54:07.137958 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7754fa43-1f4e-413a-be02-fe767c0eeb75" path="/var/lib/kubelet/pods/7754fa43-1f4e-413a-be02-fe767c0eeb75/volumes" Mar 20 15:54:08 crc kubenswrapper[4764]: I0320 15:54:08.127006 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:54:08 crc kubenswrapper[4764]: E0320 15:54:08.127348 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:54:11 crc kubenswrapper[4764]: I0320 15:54:11.376151 4764 scope.go:117] "RemoveContainer" containerID="3e4d0dbb012b759ceacb498f1497e71ce18858a4f32749b28452de2cfc5d7c57" Mar 20 15:54:22 crc kubenswrapper[4764]: I0320 15:54:22.126901 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:54:22 crc kubenswrapper[4764]: E0320 15:54:22.127938 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:54:33 crc kubenswrapper[4764]: I0320 15:54:33.126812 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:54:33 crc kubenswrapper[4764]: E0320 15:54:33.127732 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:54:48 crc kubenswrapper[4764]: I0320 15:54:48.126919 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:54:48 crc kubenswrapper[4764]: E0320 15:54:48.128827 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:55:03 crc kubenswrapper[4764]: I0320 15:55:03.126353 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:55:03 crc kubenswrapper[4764]: E0320 15:55:03.127817 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:55:17 crc kubenswrapper[4764]: I0320 15:55:17.127612 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:55:17 crc kubenswrapper[4764]: E0320 15:55:17.129308 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:55:30 crc kubenswrapper[4764]: I0320 15:55:30.126854 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:55:30 crc kubenswrapper[4764]: E0320 15:55:30.130365 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:55:42 crc kubenswrapper[4764]: I0320 15:55:42.126362 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:55:42 crc kubenswrapper[4764]: E0320 15:55:42.127267 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:55:55 crc kubenswrapper[4764]: I0320 15:55:55.126170 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 15:55:55 crc kubenswrapper[4764]: E0320 15:55:55.126878 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.154509 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567036-7hjfl"] Mar 20 15:56:00 crc kubenswrapper[4764]: E0320 15:56:00.155584 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6464809-48a0-4768-a2d8-4391ccfc04c4" containerName="oc" Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.155604 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6464809-48a0-4768-a2d8-4391ccfc04c4" containerName="oc" Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.155825 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6464809-48a0-4768-a2d8-4391ccfc04c4" containerName="oc" Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.156578 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.159343 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.159874 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.160760 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.172726 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-7hjfl"]
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.255214 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkzk\" (UniqueName: \"kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk\") pod \"auto-csr-approver-29567036-7hjfl\" (UID: \"0022e3ae-b53d-4b86-9c04-bd57f64528c5\") " pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.356481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkzk\" (UniqueName: \"kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk\") pod \"auto-csr-approver-29567036-7hjfl\" (UID: \"0022e3ae-b53d-4b86-9c04-bd57f64528c5\") " pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.373124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkzk\" (UniqueName: \"kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk\") pod \"auto-csr-approver-29567036-7hjfl\" (UID: \"0022e3ae-b53d-4b86-9c04-bd57f64528c5\") " pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.474210 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:00 crc kubenswrapper[4764]: I0320 15:56:00.943247 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-7hjfl"]
Mar 20 15:56:01 crc kubenswrapper[4764]: I0320 15:56:01.871655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-7hjfl" event={"ID":"0022e3ae-b53d-4b86-9c04-bd57f64528c5","Type":"ContainerStarted","Data":"08fbe09fa8889cd1dc76c161982f5aa566e8a31b03a6e32d6d9d3298944d8674"}
Mar 20 15:56:02 crc kubenswrapper[4764]: I0320 15:56:02.882106 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-7hjfl" event={"ID":"0022e3ae-b53d-4b86-9c04-bd57f64528c5","Type":"ContainerStarted","Data":"febddfc67e869fd54db9f23fa82570c0685aef0dfe210f7b92e440a08ead66c5"}
Mar 20 15:56:02 crc kubenswrapper[4764]: I0320 15:56:02.900499 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567036-7hjfl" podStartSLOduration=1.478786416 podStartE2EDuration="2.900480723s" podCreationTimestamp="2026-03-20 15:56:00 +0000 UTC" firstStartedPulling="2026-03-20 15:56:00.936765203 +0000 UTC m=+3882.552954372" lastFinishedPulling="2026-03-20 15:56:02.35845952 +0000 UTC m=+3883.974648679" observedRunningTime="2026-03-20 15:56:02.896109488 +0000 UTC m=+3884.512298647" watchObservedRunningTime="2026-03-20 15:56:02.900480723 +0000 UTC m=+3884.516669852"
Mar 20 15:56:03 crc kubenswrapper[4764]: I0320 15:56:03.900573 4764 generic.go:334] "Generic (PLEG): container finished" podID="0022e3ae-b53d-4b86-9c04-bd57f64528c5" containerID="febddfc67e869fd54db9f23fa82570c0685aef0dfe210f7b92e440a08ead66c5" exitCode=0
Mar 20 15:56:03 crc kubenswrapper[4764]: I0320 15:56:03.900876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-7hjfl" event={"ID":"0022e3ae-b53d-4b86-9c04-bd57f64528c5","Type":"ContainerDied","Data":"febddfc67e869fd54db9f23fa82570c0685aef0dfe210f7b92e440a08ead66c5"}
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.364929 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.559226 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkzk\" (UniqueName: \"kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk\") pod \"0022e3ae-b53d-4b86-9c04-bd57f64528c5\" (UID: \"0022e3ae-b53d-4b86-9c04-bd57f64528c5\") "
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.570518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk" (OuterVolumeSpecName: "kube-api-access-4tkzk") pod "0022e3ae-b53d-4b86-9c04-bd57f64528c5" (UID: "0022e3ae-b53d-4b86-9c04-bd57f64528c5"). InnerVolumeSpecName "kube-api-access-4tkzk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.661278 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkzk\" (UniqueName: \"kubernetes.io/projected/0022e3ae-b53d-4b86-9c04-bd57f64528c5-kube-api-access-4tkzk\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.938544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567036-7hjfl" event={"ID":"0022e3ae-b53d-4b86-9c04-bd57f64528c5","Type":"ContainerDied","Data":"08fbe09fa8889cd1dc76c161982f5aa566e8a31b03a6e32d6d9d3298944d8674"}
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.939010 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08fbe09fa8889cd1dc76c161982f5aa566e8a31b03a6e32d6d9d3298944d8674"
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.938585 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567036-7hjfl"
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.975593 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-5xjss"]
Mar 20 15:56:05 crc kubenswrapper[4764]: I0320 15:56:05.982975 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567030-5xjss"]
Mar 20 15:56:07 crc kubenswrapper[4764]: I0320 15:56:07.138134 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550adf3b-64b2-49b3-8e67-90eb76630aec" path="/var/lib/kubelet/pods/550adf3b-64b2-49b3-8e67-90eb76630aec/volumes"
Mar 20 15:56:09 crc kubenswrapper[4764]: I0320 15:56:09.988653 4764 generic.go:334] "Generic (PLEG): container finished" podID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" containerID="058c9a488281413fee8bc893a180ce8831d7178f57e7eb01f953e98e602d3b66" exitCode=1
Mar 20 15:56:09 crc kubenswrapper[4764]: I0320 15:56:09.988756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2f991298-5b9e-4568-b8b0-24d9d1978a6d","Type":"ContainerDied","Data":"058c9a488281413fee8bc893a180ce8831d7178f57e7eb01f953e98e602d3b66"}
Mar 20 15:56:10 crc kubenswrapper[4764]: I0320 15:56:10.126661 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:56:10 crc kubenswrapper[4764]: E0320 15:56:10.127006 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.535608 4764 scope.go:117] "RemoveContainer" containerID="149134fe9fb329177fa80f53e202278d3eacf06969c5c96deec9799d601dc542"
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.544077 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702651 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702683 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cjp6\" (UniqueName: \"kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.702858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config\") pod \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\" (UID: \"2f991298-5b9e-4568-b8b0-24d9d1978a6d\") "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.704071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.705037 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data" (OuterVolumeSpecName: "config-data") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.708316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6" (OuterVolumeSpecName: "kube-api-access-7cjp6") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "kube-api-access-7cjp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.709928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.710267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.734313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.738081 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.738445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.749694 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2f991298-5b9e-4568-b8b0-24d9d1978a6d" (UID: "2f991298-5b9e-4568-b8b0-24d9d1978a6d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805554 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805592 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805603 4764 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805614 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805627 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805664 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805676 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2f991298-5b9e-4568-b8b0-24d9d1978a6d-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805689 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2f991298-5b9e-4568-b8b0-24d9d1978a6d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.805701 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cjp6\" (UniqueName: \"kubernetes.io/projected/2f991298-5b9e-4568-b8b0-24d9d1978a6d-kube-api-access-7cjp6\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.828514 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 20 15:56:11 crc kubenswrapper[4764]: I0320 15:56:11.906757 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 20 15:56:12 crc kubenswrapper[4764]: I0320 15:56:12.013876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2f991298-5b9e-4568-b8b0-24d9d1978a6d","Type":"ContainerDied","Data":"faef668f6730dd27ada870308aa434e91755c8950748d509c212ee2fa844b095"}
Mar 20 15:56:12 crc kubenswrapper[4764]: I0320 15:56:12.014213 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faef668f6730dd27ada870308aa434e91755c8950748d509c212ee2fa844b095"
Mar 20 15:56:12 crc kubenswrapper[4764]: I0320 15:56:12.013973 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.126990 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:56:23 crc kubenswrapper[4764]: E0320 15:56:23.127988 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.341794 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 15:56:23 crc kubenswrapper[4764]: E0320 15:56:23.342181 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0022e3ae-b53d-4b86-9c04-bd57f64528c5" containerName="oc"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.342198 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0022e3ae-b53d-4b86-9c04-bd57f64528c5" containerName="oc"
Mar 20 15:56:23 crc kubenswrapper[4764]: E0320 15:56:23.342232 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" containerName="tempest-tests-tempest-tests-runner"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.342240 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" containerName="tempest-tests-tempest-tests-runner"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.342414 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0022e3ae-b53d-4b86-9c04-bd57f64528c5" containerName="oc"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.342442 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f991298-5b9e-4568-b8b0-24d9d1978a6d" containerName="tempest-tests-tempest-tests-runner"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.343026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.345725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-b2dfb"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.362528 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.459107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8f8h\" (UniqueName: \"kubernetes.io/projected/8fc64043-bf77-406c-930f-9663e608f2c9-kube-api-access-b8f8h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.459238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.560922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8f8h\" (UniqueName: \"kubernetes.io/projected/8fc64043-bf77-406c-930f-9663e608f2c9-kube-api-access-b8f8h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.561026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.561569 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.590171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.591218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8f8h\" (UniqueName: \"kubernetes.io/projected/8fc64043-bf77-406c-930f-9663e608f2c9-kube-api-access-b8f8h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"8fc64043-bf77-406c-930f-9663e608f2c9\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:23 crc kubenswrapper[4764]: I0320 15:56:23.681908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 15:56:24 crc kubenswrapper[4764]: I0320 15:56:24.217323 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 15:56:25 crc kubenswrapper[4764]: I0320 15:56:25.146908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8fc64043-bf77-406c-930f-9663e608f2c9","Type":"ContainerStarted","Data":"d70d680ef7c55891d0dc378be8022205e6dc55e8be81cbd7c4755fd3c7c98844"}
Mar 20 15:56:26 crc kubenswrapper[4764]: I0320 15:56:26.160558 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"8fc64043-bf77-406c-930f-9663e608f2c9","Type":"ContainerStarted","Data":"01e84c8c99e5a4624df66fa6f218202f4f571875893fbe53d4256161cea5ac10"}
Mar 20 15:56:26 crc kubenswrapper[4764]: I0320 15:56:26.189033 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.445866295 podStartE2EDuration="3.188976881s" podCreationTimestamp="2026-03-20 15:56:23 +0000 UTC" firstStartedPulling="2026-03-20 15:56:24.224741825 +0000 UTC m=+3905.840930954" lastFinishedPulling="2026-03-20 15:56:24.967852411 +0000 UTC m=+3906.584041540" observedRunningTime="2026-03-20 15:56:26.178804067 +0000 UTC m=+3907.794993206" watchObservedRunningTime="2026-03-20 15:56:26.188976881 +0000 UTC m=+3907.805166020"
Mar 20 15:56:37 crc kubenswrapper[4764]: I0320 15:56:37.128624 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:56:37 crc kubenswrapper[4764]: E0320 15:56:37.129627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:56:48 crc kubenswrapper[4764]: I0320 15:56:48.126830 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:56:48 crc kubenswrapper[4764]: E0320 15:56:48.127737 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:57:03 crc kubenswrapper[4764]: I0320 15:57:03.127058 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:57:03 crc kubenswrapper[4764]: E0320 15:57:03.128054 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"
Mar 20 15:57:16 crc kubenswrapper[4764]: I0320 15:57:16.127009 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3"
Mar 20 15:57:16 crc kubenswrapper[4764]: I0320 15:57:16.701866 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f"}
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.060328 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtskb/must-gather-gvc5c"]
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.062146 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.063690 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gtskb"/"default-dockercfg-7w8jw"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.064628 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gtskb"/"kube-root-ca.crt"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.068079 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gtskb"/"openshift-service-ca.crt"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.080185 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtskb/must-gather-gvc5c"]
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.137867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.137977 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdkgv\" (UniqueName: \"kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.239353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.239676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdkgv\" (UniqueName: \"kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.239983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.261399 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdkgv\" (UniqueName: \"kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv\") pod \"must-gather-gvc5c\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.381818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/must-gather-gvc5c"
Mar 20 15:57:17 crc kubenswrapper[4764]: I0320 15:57:17.853287 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gtskb/must-gather-gvc5c"]
Mar 20 15:57:18 crc kubenswrapper[4764]: I0320 15:57:18.719524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/must-gather-gvc5c" event={"ID":"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13","Type":"ContainerStarted","Data":"d8989fe73c475f4ddcb7de02cd698e97624df78d2761ebe6cb9e4cb227e111d6"}
Mar 20 15:57:22 crc kubenswrapper[4764]: I0320 15:57:22.796656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/must-gather-gvc5c" event={"ID":"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13","Type":"ContainerStarted","Data":"6da3866c15059f3ade16ff9e4ea75d213f3b63ba3e9cba6be3c114c7c16e8e6f"}
Mar 20 15:57:22 crc kubenswrapper[4764]: I0320 15:57:22.798672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/must-gather-gvc5c" event={"ID":"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13","Type":"ContainerStarted","Data":"1621eb75720d41d35965fd43babccf5f2859e3dc7027bffd2a1201c019dbd437"}
Mar 20 15:57:22 crc kubenswrapper[4764]: I0320 15:57:22.819473 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtskb/must-gather-gvc5c" podStartSLOduration=1.6059786950000001 podStartE2EDuration="5.819415801s" podCreationTimestamp="2026-03-20 15:57:17 +0000 UTC" firstStartedPulling="2026-03-20 15:57:17.856005853 +0000 UTC m=+3959.472194982" lastFinishedPulling="2026-03-20 15:57:22.069442949 +0000 UTC m=+3963.685632088" observedRunningTime="2026-03-20 15:57:22.81160828 +0000 UTC m=+3964.427797409" watchObservedRunningTime="2026-03-20 15:57:22.819415801 +0000 UTC m=+3964.435604930"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.138873 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtskb/crc-debug-hrp8p"]
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.141969 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.276475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6w6j\" (UniqueName: \"kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.276857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.379641 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6w6j\" (UniqueName: \"kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.379685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.379903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.402448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6w6j\" (UniqueName: \"kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j\") pod \"crc-debug-hrp8p\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.463516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-hrp8p"
Mar 20 15:57:27 crc kubenswrapper[4764]: W0320 15:57:27.502895 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8101f58d_2cb2_48f5_bf51_7020f021699f.slice/crio-d2a725b2b44a132059aa7f4b0954be26c7dab44717e4d2cd2d02806de17ad536 WatchSource:0}: Error finding container d2a725b2b44a132059aa7f4b0954be26c7dab44717e4d2cd2d02806de17ad536: Status 404 returned error can't find the container with id d2a725b2b44a132059aa7f4b0954be26c7dab44717e4d2cd2d02806de17ad536
Mar 20 15:57:27 crc kubenswrapper[4764]: I0320 15:57:27.843656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" event={"ID":"8101f58d-2cb2-48f5-bf51-7020f021699f","Type":"ContainerStarted","Data":"d2a725b2b44a132059aa7f4b0954be26c7dab44717e4d2cd2d02806de17ad536"}
Mar 20 15:57:39 crc kubenswrapper[4764]: I0320 15:57:39.944455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" event={"ID":"8101f58d-2cb2-48f5-bf51-7020f021699f","Type":"ContainerStarted","Data":"41bf334453cfa54e9f75adb3040a4129857833408c37eac2463ab7d30ab28294"}
Mar 20 15:57:39 crc kubenswrapper[4764]: I0320
15:57:39.978159 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" podStartSLOduration=1.423624754 podStartE2EDuration="12.978140498s" podCreationTimestamp="2026-03-20 15:57:27 +0000 UTC" firstStartedPulling="2026-03-20 15:57:27.505326154 +0000 UTC m=+3969.121515283" lastFinishedPulling="2026-03-20 15:57:39.059841898 +0000 UTC m=+3980.676031027" observedRunningTime="2026-03-20 15:57:39.977614721 +0000 UTC m=+3981.593803870" watchObservedRunningTime="2026-03-20 15:57:39.978140498 +0000 UTC m=+3981.594329647" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.137726 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567038-vzfgd"] Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.140094 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.142311 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.142554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.142814 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.147946 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-vzfgd"] Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.153261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5g6\" (UniqueName: \"kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6\") pod \"auto-csr-approver-29567038-vzfgd\" (UID: 
\"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6\") " pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.256773 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5g6\" (UniqueName: \"kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6\") pod \"auto-csr-approver-29567038-vzfgd\" (UID: \"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6\") " pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.276620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5g6\" (UniqueName: \"kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6\") pod \"auto-csr-approver-29567038-vzfgd\" (UID: \"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6\") " pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.456705 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:00 crc kubenswrapper[4764]: I0320 15:58:00.960157 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-vzfgd"] Mar 20 15:58:01 crc kubenswrapper[4764]: I0320 15:58:01.112506 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" event={"ID":"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6","Type":"ContainerStarted","Data":"74e4d5e055a3452ba4b6e86f54b170aa2a6c3ed920379a5abe853f3dbef5fd64"} Mar 20 15:58:03 crc kubenswrapper[4764]: I0320 15:58:03.130744 4764 generic.go:334] "Generic (PLEG): container finished" podID="4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" containerID="0440f51eca52947dbe76d0dec6fec5fe69c76555741152e187ee1e4bb83eadde" exitCode=0 Mar 20 15:58:03 crc kubenswrapper[4764]: I0320 15:58:03.138740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" event={"ID":"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6","Type":"ContainerDied","Data":"0440f51eca52947dbe76d0dec6fec5fe69c76555741152e187ee1e4bb83eadde"} Mar 20 15:58:04 crc kubenswrapper[4764]: I0320 15:58:04.497467 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:04 crc kubenswrapper[4764]: I0320 15:58:04.642513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz5g6\" (UniqueName: \"kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6\") pod \"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6\" (UID: \"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6\") " Mar 20 15:58:04 crc kubenswrapper[4764]: I0320 15:58:04.647945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6" (OuterVolumeSpecName: "kube-api-access-bz5g6") pod "4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" (UID: "4c296ef8-0537-4c67-a4cb-bcd3fe6211d6"). InnerVolumeSpecName "kube-api-access-bz5g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:04 crc kubenswrapper[4764]: I0320 15:58:04.744953 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz5g6\" (UniqueName: \"kubernetes.io/projected/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6-kube-api-access-bz5g6\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:05 crc kubenswrapper[4764]: I0320 15:58:05.172443 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" Mar 20 15:58:05 crc kubenswrapper[4764]: I0320 15:58:05.172468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567038-vzfgd" event={"ID":"4c296ef8-0537-4c67-a4cb-bcd3fe6211d6","Type":"ContainerDied","Data":"74e4d5e055a3452ba4b6e86f54b170aa2a6c3ed920379a5abe853f3dbef5fd64"} Mar 20 15:58:05 crc kubenswrapper[4764]: I0320 15:58:05.172777 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e4d5e055a3452ba4b6e86f54b170aa2a6c3ed920379a5abe853f3dbef5fd64" Mar 20 15:58:05 crc kubenswrapper[4764]: I0320 15:58:05.562708 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-lvfpn"] Mar 20 15:58:05 crc kubenswrapper[4764]: I0320 15:58:05.570372 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567032-lvfpn"] Mar 20 15:58:07 crc kubenswrapper[4764]: I0320 15:58:07.139588 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2399033e-a5ae-4d16-88bb-c5b019f65bde" path="/var/lib/kubelet/pods/2399033e-a5ae-4d16-88bb-c5b019f65bde/volumes" Mar 20 15:58:11 crc kubenswrapper[4764]: I0320 15:58:11.642105 4764 scope.go:117] "RemoveContainer" containerID="00b6d62499524adfd202ae445e63f8cfbb61968064af9759362b2f7c61c6fd79" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.843660 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:13 crc kubenswrapper[4764]: E0320 15:58:13.844585 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" containerName="oc" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.844600 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" containerName="oc" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 
15:58:13.844849 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" containerName="oc" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.846999 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.864973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.923726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ps8\" (UniqueName: \"kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.923858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:13 crc kubenswrapper[4764]: I0320 15:58:13.923936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.025812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ps8\" (UniqueName: 
\"kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.026043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.026152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.026593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.026618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.048330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ps8\" (UniqueName: 
\"kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8\") pod \"community-operators-ks2g7\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.170929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:14 crc kubenswrapper[4764]: I0320 15:58:14.754930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:15 crc kubenswrapper[4764]: I0320 15:58:15.278884 4764 generic.go:334] "Generic (PLEG): container finished" podID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerID="2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5" exitCode=0 Mar 20 15:58:15 crc kubenswrapper[4764]: I0320 15:58:15.279008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerDied","Data":"2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5"} Mar 20 15:58:15 crc kubenswrapper[4764]: I0320 15:58:15.279154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerStarted","Data":"c6e78f3e3e9b70a4f8c04c599b9dbc2d46c0c66d66829fa7f474303c47873ff6"} Mar 20 15:58:15 crc kubenswrapper[4764]: I0320 15:58:15.281114 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:58:16 crc kubenswrapper[4764]: I0320 15:58:16.287626 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerStarted","Data":"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab"} Mar 20 15:58:18 
crc kubenswrapper[4764]: I0320 15:58:18.322046 4764 generic.go:334] "Generic (PLEG): container finished" podID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerID="3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab" exitCode=0 Mar 20 15:58:18 crc kubenswrapper[4764]: I0320 15:58:18.322538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerDied","Data":"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab"} Mar 20 15:58:19 crc kubenswrapper[4764]: I0320 15:58:19.333958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerStarted","Data":"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f"} Mar 20 15:58:19 crc kubenswrapper[4764]: I0320 15:58:19.358608 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ks2g7" podStartSLOduration=2.8638992119999997 podStartE2EDuration="6.358584061s" podCreationTimestamp="2026-03-20 15:58:13 +0000 UTC" firstStartedPulling="2026-03-20 15:58:15.280840984 +0000 UTC m=+4016.897030113" lastFinishedPulling="2026-03-20 15:58:18.775525823 +0000 UTC m=+4020.391714962" observedRunningTime="2026-03-20 15:58:19.352603747 +0000 UTC m=+4020.968792886" watchObservedRunningTime="2026-03-20 15:58:19.358584061 +0000 UTC m=+4020.974773200" Mar 20 15:58:24 crc kubenswrapper[4764]: I0320 15:58:24.172365 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:24 crc kubenswrapper[4764]: I0320 15:58:24.173739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:24 crc kubenswrapper[4764]: I0320 15:58:24.213506 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:24 crc kubenswrapper[4764]: I0320 15:58:24.418866 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:24 crc kubenswrapper[4764]: I0320 15:58:24.466849 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:26 crc kubenswrapper[4764]: I0320 15:58:26.386472 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ks2g7" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="registry-server" containerID="cri-o://65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f" gracePeriod=2 Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.219642 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.396679 4764 generic.go:334] "Generic (PLEG): container finished" podID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerID="65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f" exitCode=0 Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.396727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerDied","Data":"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f"} Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.396733 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ks2g7" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.396766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks2g7" event={"ID":"d62d35c7-7183-4922-9cfd-c35c5a69a8fb","Type":"ContainerDied","Data":"c6e78f3e3e9b70a4f8c04c599b9dbc2d46c0c66d66829fa7f474303c47873ff6"} Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.396790 4764 scope.go:117] "RemoveContainer" containerID="65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.407555 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities\") pod \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.407624 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content\") pod \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.407670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4ps8\" (UniqueName: \"kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8\") pod \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\" (UID: \"d62d35c7-7183-4922-9cfd-c35c5a69a8fb\") " Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.409553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities" (OuterVolumeSpecName: "utilities") pod "d62d35c7-7183-4922-9cfd-c35c5a69a8fb" (UID: "d62d35c7-7183-4922-9cfd-c35c5a69a8fb"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.414792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8" (OuterVolumeSpecName: "kube-api-access-s4ps8") pod "d62d35c7-7183-4922-9cfd-c35c5a69a8fb" (UID: "d62d35c7-7183-4922-9cfd-c35c5a69a8fb"). InnerVolumeSpecName "kube-api-access-s4ps8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.429328 4764 scope.go:117] "RemoveContainer" containerID="3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.461626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d62d35c7-7183-4922-9cfd-c35c5a69a8fb" (UID: "d62d35c7-7183-4922-9cfd-c35c5a69a8fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.474825 4764 scope.go:117] "RemoveContainer" containerID="2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.508852 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4ps8\" (UniqueName: \"kubernetes.io/projected/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-kube-api-access-s4ps8\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.508884 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.508894 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d62d35c7-7183-4922-9cfd-c35c5a69a8fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.523759 4764 scope.go:117] "RemoveContainer" containerID="65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f" Mar 20 15:58:27 crc kubenswrapper[4764]: E0320 15:58:27.524167 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f\": container with ID starting with 65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f not found: ID does not exist" containerID="65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.524199 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f"} err="failed to get container status 
\"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f\": rpc error: code = NotFound desc = could not find container \"65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f\": container with ID starting with 65e207335f2ce7a3604693efc25e298db9a8074b627d41898a4a3626772c9a0f not found: ID does not exist" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.524220 4764 scope.go:117] "RemoveContainer" containerID="3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab" Mar 20 15:58:27 crc kubenswrapper[4764]: E0320 15:58:27.524405 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab\": container with ID starting with 3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab not found: ID does not exist" containerID="3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.524426 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab"} err="failed to get container status \"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab\": rpc error: code = NotFound desc = could not find container \"3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab\": container with ID starting with 3f7ca0ad613b2772157faaa3b50efc1e04c4835d5cc55d7845eacc909f8885ab not found: ID does not exist" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.524439 4764 scope.go:117] "RemoveContainer" containerID="2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5" Mar 20 15:58:27 crc kubenswrapper[4764]: E0320 15:58:27.524786 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5\": container with ID starting with 2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5 not found: ID does not exist" containerID="2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.524838 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5"} err="failed to get container status \"2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5\": rpc error: code = NotFound desc = could not find container \"2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5\": container with ID starting with 2217bda151b74d090f642455c2bea315cf3a9fc0a8033bdea22d555ab38d6da5 not found: ID does not exist" Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.737170 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:27 crc kubenswrapper[4764]: I0320 15:58:27.745720 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ks2g7"] Mar 20 15:58:29 crc kubenswrapper[4764]: I0320 15:58:29.145980 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" path="/var/lib/kubelet/pods/d62d35c7-7183-4922-9cfd-c35c5a69a8fb/volumes" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.436103 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:31 crc kubenswrapper[4764]: E0320 15:58:31.437865 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="registry-server" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.438025 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" 
containerName="registry-server" Mar 20 15:58:31 crc kubenswrapper[4764]: E0320 15:58:31.438117 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="extract-utilities" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.438198 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="extract-utilities" Mar 20 15:58:31 crc kubenswrapper[4764]: E0320 15:58:31.438294 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="extract-content" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.438401 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="extract-content" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.438785 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62d35c7-7183-4922-9cfd-c35c5a69a8fb" containerName="registry-server" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.440566 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.452861 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.597225 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.597285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rp25\" (UniqueName: \"kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.597361 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.699518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.699634 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5rp25\" (UniqueName: \"kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.699806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.699903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.700276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.731233 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rp25\" (UniqueName: \"kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25\") pod \"redhat-marketplace-m6d7s\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:31 crc kubenswrapper[4764]: I0320 15:58:31.768428 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:32 crc kubenswrapper[4764]: I0320 15:58:32.108755 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:32 crc kubenswrapper[4764]: I0320 15:58:32.446068 4764 generic.go:334] "Generic (PLEG): container finished" podID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerID="ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185" exitCode=0 Mar 20 15:58:32 crc kubenswrapper[4764]: I0320 15:58:32.446125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerDied","Data":"ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185"} Mar 20 15:58:32 crc kubenswrapper[4764]: I0320 15:58:32.446365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerStarted","Data":"2ccfc4e58db29cee97c6765b7ae3b4278127fc9fc9561bae4209754d4c6fff23"} Mar 20 15:58:33 crc kubenswrapper[4764]: I0320 15:58:33.458344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerStarted","Data":"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c"} Mar 20 15:58:34 crc kubenswrapper[4764]: I0320 15:58:34.473994 4764 generic.go:334] "Generic (PLEG): container finished" podID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerID="ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c" exitCode=0 Mar 20 15:58:34 crc kubenswrapper[4764]: I0320 15:58:34.474056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" 
event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerDied","Data":"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c"} Mar 20 15:58:35 crc kubenswrapper[4764]: I0320 15:58:35.486826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerStarted","Data":"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b"} Mar 20 15:58:35 crc kubenswrapper[4764]: I0320 15:58:35.522190 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m6d7s" podStartSLOduration=2.02649921 podStartE2EDuration="4.522165211s" podCreationTimestamp="2026-03-20 15:58:31 +0000 UTC" firstStartedPulling="2026-03-20 15:58:32.448141016 +0000 UTC m=+4034.064330145" lastFinishedPulling="2026-03-20 15:58:34.943807017 +0000 UTC m=+4036.559996146" observedRunningTime="2026-03-20 15:58:35.510266292 +0000 UTC m=+4037.126455461" watchObservedRunningTime="2026-03-20 15:58:35.522165211 +0000 UTC m=+4037.138354380" Mar 20 15:58:40 crc kubenswrapper[4764]: I0320 15:58:40.553309 4764 generic.go:334] "Generic (PLEG): container finished" podID="8101f58d-2cb2-48f5-bf51-7020f021699f" containerID="41bf334453cfa54e9f75adb3040a4129857833408c37eac2463ab7d30ab28294" exitCode=0 Mar 20 15:58:40 crc kubenswrapper[4764]: I0320 15:58:40.553419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" event={"ID":"8101f58d-2cb2-48f5-bf51-7020f021699f","Type":"ContainerDied","Data":"41bf334453cfa54e9f75adb3040a4129857833408c37eac2463ab7d30ab28294"} Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.658666 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.674154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host\") pod \"8101f58d-2cb2-48f5-bf51-7020f021699f\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.674344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6w6j\" (UniqueName: \"kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j\") pod \"8101f58d-2cb2-48f5-bf51-7020f021699f\" (UID: \"8101f58d-2cb2-48f5-bf51-7020f021699f\") " Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.674553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host" (OuterVolumeSpecName: "host") pod "8101f58d-2cb2-48f5-bf51-7020f021699f" (UID: "8101f58d-2cb2-48f5-bf51-7020f021699f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.675040 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8101f58d-2cb2-48f5-bf51-7020f021699f-host\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.696135 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j" (OuterVolumeSpecName: "kube-api-access-l6w6j") pod "8101f58d-2cb2-48f5-bf51-7020f021699f" (UID: "8101f58d-2cb2-48f5-bf51-7020f021699f"). InnerVolumeSpecName "kube-api-access-l6w6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.714804 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-hrp8p"] Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.724815 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-hrp8p"] Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.768573 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.768651 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.776190 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6w6j\" (UniqueName: \"kubernetes.io/projected/8101f58d-2cb2-48f5-bf51-7020f021699f-kube-api-access-l6w6j\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:41 crc kubenswrapper[4764]: I0320 15:58:41.820926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.575899 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a725b2b44a132059aa7f4b0954be26c7dab44717e4d2cd2d02806de17ad536" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.575934 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-hrp8p" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.631703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.693666 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.873780 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtskb/crc-debug-vgcqt"] Mar 20 15:58:42 crc kubenswrapper[4764]: E0320 15:58:42.874341 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8101f58d-2cb2-48f5-bf51-7020f021699f" containerName="container-00" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.874363 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8101f58d-2cb2-48f5-bf51-7020f021699f" containerName="container-00" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.874612 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8101f58d-2cb2-48f5-bf51-7020f021699f" containerName="container-00" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.875459 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.896648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.896819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wrd\" (UniqueName: \"kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.998318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.998408 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:42 crc kubenswrapper[4764]: I0320 15:58:42.999016 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wrd\" (UniqueName: \"kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:43 crc 
kubenswrapper[4764]: I0320 15:58:43.015937 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wrd\" (UniqueName: \"kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd\") pod \"crc-debug-vgcqt\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:43 crc kubenswrapper[4764]: I0320 15:58:43.137984 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8101f58d-2cb2-48f5-bf51-7020f021699f" path="/var/lib/kubelet/pods/8101f58d-2cb2-48f5-bf51-7020f021699f/volumes" Mar 20 15:58:43 crc kubenswrapper[4764]: I0320 15:58:43.195855 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:43 crc kubenswrapper[4764]: I0320 15:58:43.587212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" event={"ID":"0fc4133c-e06e-4a1a-8fec-49299b84d945","Type":"ContainerStarted","Data":"1148f46a2fc0f16672e25a26cef7b7f41995c3d88f61b7bd07ea48022ec50b6e"} Mar 20 15:58:43 crc kubenswrapper[4764]: I0320 15:58:43.587598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" event={"ID":"0fc4133c-e06e-4a1a-8fec-49299b84d945","Type":"ContainerStarted","Data":"5cd8146e9e7b84f92f724404e166e53a525d51388f7de0f8dac98cc47e8ad190"} Mar 20 15:58:43 crc kubenswrapper[4764]: I0320 15:58:43.620730 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" podStartSLOduration=1.620707983 podStartE2EDuration="1.620707983s" podCreationTimestamp="2026-03-20 15:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 15:58:43.601072746 +0000 UTC m=+4045.217261875" watchObservedRunningTime="2026-03-20 15:58:43.620707983 +0000 
UTC m=+4045.236897122" Mar 20 15:58:44 crc kubenswrapper[4764]: I0320 15:58:44.594288 4764 generic.go:334] "Generic (PLEG): container finished" podID="0fc4133c-e06e-4a1a-8fec-49299b84d945" containerID="1148f46a2fc0f16672e25a26cef7b7f41995c3d88f61b7bd07ea48022ec50b6e" exitCode=0 Mar 20 15:58:44 crc kubenswrapper[4764]: I0320 15:58:44.594373 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" event={"ID":"0fc4133c-e06e-4a1a-8fec-49299b84d945","Type":"ContainerDied","Data":"1148f46a2fc0f16672e25a26cef7b7f41995c3d88f61b7bd07ea48022ec50b6e"} Mar 20 15:58:44 crc kubenswrapper[4764]: I0320 15:58:44.594509 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m6d7s" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="registry-server" containerID="cri-o://f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b" gracePeriod=2 Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.137569 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.236827 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content\") pod \"15d31f1d-c8b9-4964-9111-3289ef91f71b\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.237096 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities\") pod \"15d31f1d-c8b9-4964-9111-3289ef91f71b\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.237150 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rp25\" (UniqueName: \"kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25\") pod \"15d31f1d-c8b9-4964-9111-3289ef91f71b\" (UID: \"15d31f1d-c8b9-4964-9111-3289ef91f71b\") " Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.238001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities" (OuterVolumeSpecName: "utilities") pod "15d31f1d-c8b9-4964-9111-3289ef91f71b" (UID: "15d31f1d-c8b9-4964-9111-3289ef91f71b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.249604 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25" (OuterVolumeSpecName: "kube-api-access-5rp25") pod "15d31f1d-c8b9-4964-9111-3289ef91f71b" (UID: "15d31f1d-c8b9-4964-9111-3289ef91f71b"). InnerVolumeSpecName "kube-api-access-5rp25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.269131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15d31f1d-c8b9-4964-9111-3289ef91f71b" (UID: "15d31f1d-c8b9-4964-9111-3289ef91f71b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.339169 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rp25\" (UniqueName: \"kubernetes.io/projected/15d31f1d-c8b9-4964-9111-3289ef91f71b-kube-api-access-5rp25\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.339211 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.339225 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d31f1d-c8b9-4964-9111-3289ef91f71b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.604305 4764 generic.go:334] "Generic (PLEG): container finished" podID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerID="f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b" exitCode=0 Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.604408 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m6d7s" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.604377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerDied","Data":"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b"} Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.604491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m6d7s" event={"ID":"15d31f1d-c8b9-4964-9111-3289ef91f71b","Type":"ContainerDied","Data":"2ccfc4e58db29cee97c6765b7ae3b4278127fc9fc9561bae4209754d4c6fff23"} Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.604509 4764 scope.go:117] "RemoveContainer" containerID="f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.678428 4764 scope.go:117] "RemoveContainer" containerID="ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.678847 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.696471 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.704028 4764 scope.go:117] "RemoveContainer" containerID="ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.706049 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m6d7s"] Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.744031 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2wrd\" (UniqueName: \"kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd\") pod \"0fc4133c-e06e-4a1a-8fec-49299b84d945\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.744217 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host\") pod \"0fc4133c-e06e-4a1a-8fec-49299b84d945\" (UID: \"0fc4133c-e06e-4a1a-8fec-49299b84d945\") " Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.744314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host" (OuterVolumeSpecName: "host") pod "0fc4133c-e06e-4a1a-8fec-49299b84d945" (UID: "0fc4133c-e06e-4a1a-8fec-49299b84d945"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.744831 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fc4133c-e06e-4a1a-8fec-49299b84d945-host\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.749779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd" (OuterVolumeSpecName: "kube-api-access-k2wrd") pod "0fc4133c-e06e-4a1a-8fec-49299b84d945" (UID: "0fc4133c-e06e-4a1a-8fec-49299b84d945"). InnerVolumeSpecName "kube-api-access-k2wrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.807258 4764 scope.go:117] "RemoveContainer" containerID="f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b" Mar 20 15:58:45 crc kubenswrapper[4764]: E0320 15:58:45.807824 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b\": container with ID starting with f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b not found: ID does not exist" containerID="f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.807867 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b"} err="failed to get container status \"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b\": rpc error: code = NotFound desc = could not find container \"f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b\": container with ID starting with f3adcbde24d78c29e5d53d87195f92327f9b95f8786c43a5180b75df9d21646b not found: ID does not 
exist" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.807902 4764 scope.go:117] "RemoveContainer" containerID="ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c" Mar 20 15:58:45 crc kubenswrapper[4764]: E0320 15:58:45.808353 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c\": container with ID starting with ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c not found: ID does not exist" containerID="ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.808417 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c"} err="failed to get container status \"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c\": rpc error: code = NotFound desc = could not find container \"ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c\": container with ID starting with ef6464db82015ecb10cf6d1fd62c7281c6dc5ecf822c3c09e495ccb8569ff55c not found: ID does not exist" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.808451 4764 scope.go:117] "RemoveContainer" containerID="ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185" Mar 20 15:58:45 crc kubenswrapper[4764]: E0320 15:58:45.808757 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185\": container with ID starting with ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185 not found: ID does not exist" containerID="ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.808788 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185"} err="failed to get container status \"ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185\": rpc error: code = NotFound desc = could not find container \"ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185\": container with ID starting with ff8ff58167d655331da0a181ad4c0d58dec43931195224826e36a7f9e0a4e185 not found: ID does not exist" Mar 20 15:58:45 crc kubenswrapper[4764]: I0320 15:58:45.845969 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2wrd\" (UniqueName: \"kubernetes.io/projected/0fc4133c-e06e-4a1a-8fec-49299b84d945-kube-api-access-k2wrd\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:46 crc kubenswrapper[4764]: I0320 15:58:46.616343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" event={"ID":"0fc4133c-e06e-4a1a-8fec-49299b84d945","Type":"ContainerDied","Data":"5cd8146e9e7b84f92f724404e166e53a525d51388f7de0f8dac98cc47e8ad190"} Mar 20 15:58:46 crc kubenswrapper[4764]: I0320 15:58:46.616665 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd8146e9e7b84f92f724404e166e53a525d51388f7de0f8dac98cc47e8ad190" Mar 20 15:58:46 crc kubenswrapper[4764]: I0320 15:58:46.616454 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-vgcqt" Mar 20 15:58:47 crc kubenswrapper[4764]: I0320 15:58:47.146479 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" path="/var/lib/kubelet/pods/15d31f1d-c8b9-4964-9111-3289ef91f71b/volumes" Mar 20 15:58:47 crc kubenswrapper[4764]: I0320 15:58:47.539934 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-vgcqt"] Mar 20 15:58:47 crc kubenswrapper[4764]: I0320 15:58:47.550270 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-vgcqt"] Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.779107 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gtskb/crc-debug-zbrmd"] Mar 20 15:58:48 crc kubenswrapper[4764]: E0320 15:58:48.779828 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="registry-server" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.779843 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="registry-server" Mar 20 15:58:48 crc kubenswrapper[4764]: E0320 15:58:48.779867 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="extract-content" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.779875 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="extract-content" Mar 20 15:58:48 crc kubenswrapper[4764]: E0320 15:58:48.779897 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="extract-utilities" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.779907 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" 
containerName="extract-utilities" Mar 20 15:58:48 crc kubenswrapper[4764]: E0320 15:58:48.779920 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc4133c-e06e-4a1a-8fec-49299b84d945" containerName="container-00" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.779926 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc4133c-e06e-4a1a-8fec-49299b84d945" containerName="container-00" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.780108 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc4133c-e06e-4a1a-8fec-49299b84d945" containerName="container-00" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.780129 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d31f1d-c8b9-4964-9111-3289ef91f71b" containerName="registry-server" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.780768 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.794595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcltt\" (UniqueName: \"kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.794673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.896806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcltt\" (UniqueName: 
\"kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.896878 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.897058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:48 crc kubenswrapper[4764]: I0320 15:58:48.917246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcltt\" (UniqueName: \"kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt\") pod \"crc-debug-zbrmd\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.118567 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.157975 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc4133c-e06e-4a1a-8fec-49299b84d945" path="/var/lib/kubelet/pods/0fc4133c-e06e-4a1a-8fec-49299b84d945/volumes" Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.652794 4764 generic.go:334] "Generic (PLEG): container finished" podID="57078fc2-ca2b-44f5-afe8-ba3b157b2007" containerID="434680b2774438db0bdbf83b7d468a843fc1f2c1337bdf18f506c070602fb0eb" exitCode=0 Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.652908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" event={"ID":"57078fc2-ca2b-44f5-afe8-ba3b157b2007","Type":"ContainerDied","Data":"434680b2774438db0bdbf83b7d468a843fc1f2c1337bdf18f506c070602fb0eb"} Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.653233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" event={"ID":"57078fc2-ca2b-44f5-afe8-ba3b157b2007","Type":"ContainerStarted","Data":"db0cea4e729782a47d4f474ecb1f8713751863a8803469c2ebba13c630927ad6"} Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.696511 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-zbrmd"] Mar 20 15:58:49 crc kubenswrapper[4764]: I0320 15:58:49.707357 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gtskb/crc-debug-zbrmd"] Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.757507 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.837583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcltt\" (UniqueName: \"kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt\") pod \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.837651 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host\") pod \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\" (UID: \"57078fc2-ca2b-44f5-afe8-ba3b157b2007\") " Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.837771 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host" (OuterVolumeSpecName: "host") pod "57078fc2-ca2b-44f5-afe8-ba3b157b2007" (UID: "57078fc2-ca2b-44f5-afe8-ba3b157b2007"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.838221 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57078fc2-ca2b-44f5-afe8-ba3b157b2007-host\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.882394 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt" (OuterVolumeSpecName: "kube-api-access-jcltt") pod "57078fc2-ca2b-44f5-afe8-ba3b157b2007" (UID: "57078fc2-ca2b-44f5-afe8-ba3b157b2007"). InnerVolumeSpecName "kube-api-access-jcltt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:58:50 crc kubenswrapper[4764]: I0320 15:58:50.940314 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcltt\" (UniqueName: \"kubernetes.io/projected/57078fc2-ca2b-44f5-afe8-ba3b157b2007-kube-api-access-jcltt\") on node \"crc\" DevicePath \"\"" Mar 20 15:58:51 crc kubenswrapper[4764]: I0320 15:58:51.140284 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57078fc2-ca2b-44f5-afe8-ba3b157b2007" path="/var/lib/kubelet/pods/57078fc2-ca2b-44f5-afe8-ba3b157b2007/volumes" Mar 20 15:58:51 crc kubenswrapper[4764]: I0320 15:58:51.669790 4764 scope.go:117] "RemoveContainer" containerID="434680b2774438db0bdbf83b7d468a843fc1f2c1337bdf18f506c070602fb0eb" Mar 20 15:58:51 crc kubenswrapper[4764]: I0320 15:58:51.669863 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/crc-debug-zbrmd" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.256047 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58988f9f54-f58q6_8ed91bcd-a582-4c3c-893d-a1f081c657ee/barbican-api/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.270568 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58988f9f54-f58q6_8ed91bcd-a582-4c3c-893d-a1f081c657ee/barbican-api-log/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.503285 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9846b946b-trg4v_a5dda5ed-7a47-43f8-833f-a715aafe6c24/barbican-keystone-listener/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.611470 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-58fcdd88fc-hzbh5_b3fc5a72-a60e-4eef-be6a-aa4387e95c45/barbican-worker/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.761797 4764 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-worker-58fcdd88fc-hzbh5_b3fc5a72-a60e-4eef-be6a-aa4387e95c45/barbican-worker-log/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.777030 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-9846b946b-trg4v_a5dda5ed-7a47-43f8-833f-a715aafe6c24/barbican-keystone-listener-log/0.log" Mar 20 15:59:07 crc kubenswrapper[4764]: I0320 15:59:07.928426 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8m8w4_db572961-158b-4953-aaf1-af5b9e940592/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.033314 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2c43fb18-f22b-4423-8241-a6785a42b6e8/ceilometer-central-agent/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.123220 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2c43fb18-f22b-4423-8241-a6785a42b6e8/proxy-httpd/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.124368 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2c43fb18-f22b-4423-8241-a6785a42b6e8/ceilometer-notification-agent/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.164423 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2c43fb18-f22b-4423-8241-a6785a42b6e8/sg-core/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.294466 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cadf6142-f417-45ee-9c6b-92378a298170/cinder-api-log/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.354080 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cadf6142-f417-45ee-9c6b-92378a298170/cinder-api/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.563626 4764 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a766ed7e-8243-41e5-b3b1-bc3bdd0e069f/cinder-scheduler/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.571469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a766ed7e-8243-41e5-b3b1-bc3bdd0e069f/probe/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.737877 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ftwzw_95634505-7484-4887-973b-a91a632c48d1/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.916597 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vpd4g_580850a5-6a99-4108-aeb7-df44798943e8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:08 crc kubenswrapper[4764]: I0320 15:59:08.980421 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nlkxh_4ab04f17-2a4f-4352-9ef0-8daf105eb96d/init/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.192815 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nlkxh_4ab04f17-2a4f-4352-9ef0-8daf105eb96d/init/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.201495 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-nlkxh_4ab04f17-2a4f-4352-9ef0-8daf105eb96d/dnsmasq-dns/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.249700 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-ng2vm_297e67fe-a3b9-4b0d-af54-c0d1ba5cb34c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.422317 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_0def9a2d-c1f7-49a4-ae02-f32e54035e05/glance-httpd/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.479296 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0def9a2d-c1f7-49a4-ae02-f32e54035e05/glance-log/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.622809 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6abb1d10-04cf-4dc4-81e3-f5db6d8d545f/glance-httpd/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.684315 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6abb1d10-04cf-4dc4-81e3-f5db6d8d545f/glance-log/0.log" Mar 20 15:59:09 crc kubenswrapper[4764]: I0320 15:59:09.889897 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-655785589d-5cnb4_cb148fab-0227-4725-af4e-d6dba5740303/horizon/0.log" Mar 20 15:59:10 crc kubenswrapper[4764]: I0320 15:59:10.172579 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gfl4q_5201bbcb-c3f8-4a8d-81be-b4eaf33cc966/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:10 crc kubenswrapper[4764]: I0320 15:59:10.481475 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-655785589d-5cnb4_cb148fab-0227-4725-af4e-d6dba5740303/horizon-log/0.log" Mar 20 15:59:10 crc kubenswrapper[4764]: I0320 15:59:10.599843 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pmxrf_78596cc8-76e1-4603-b970-b59b504531c3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:10 crc kubenswrapper[4764]: I0320 15:59:10.733034 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_5f33a933-0b09-4945-8d48-449ea849f7e1/kube-state-metrics/0.log" Mar 20 15:59:11 crc kubenswrapper[4764]: I0320 15:59:11.648745 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7c69f8c7f-rlffj_8d9fa689-47c1-464b-82a8-9f047a084ed7/keystone-api/0.log" Mar 20 15:59:11 crc kubenswrapper[4764]: I0320 15:59:11.980513 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56ccbd6c69-4c89q_33b46429-6eed-4b3c-8a29-39b923aad151/neutron-httpd/0.log" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.003914 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-m5lqz_f80dffa9-1ef1-4046-854e-66dcbdff3c9c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.222758 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-dbzrt_c81117b3-6d55-444b-bed2-9b7eac23bf8e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.422533 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-56ccbd6c69-4c89q_33b46429-6eed-4b3c-8a29-39b923aad151/neutron-api/0.log" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.762124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:12 crc kubenswrapper[4764]: E0320 15:59:12.762688 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57078fc2-ca2b-44f5-afe8-ba3b157b2007" containerName="container-00" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.762699 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="57078fc2-ca2b-44f5-afe8-ba3b157b2007" containerName="container-00" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.762881 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="57078fc2-ca2b-44f5-afe8-ba3b157b2007" containerName="container-00" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.764610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.776416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.874522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z7s\" (UniqueName: \"kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.874641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.874689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.976606 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z7s\" (UniqueName: \"kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s\") pod 
\"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.976703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.976735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.977264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.977596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:12 crc kubenswrapper[4764]: I0320 15:59:12.985906 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_700deb88-8c5b-470d-a686-664ec01cc1e4/nova-cell0-conductor-conductor/0.log" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.007910 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z8z7s\" (UniqueName: \"kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s\") pod \"redhat-operators-5qhqb\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.091747 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.302942 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_35ff2d6b-dfa9-41fe-9885-f71d494d6bab/nova-cell1-conductor-conductor/0.log" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.377405 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0956f601-6858-456b-8f63-ea6c5b4aebe1/nova-api-log/0.log" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.517203 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0956f601-6858-456b-8f63-ea6c5b4aebe1/nova-api-api/0.log" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.613135 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.614132 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_af7cb7f6-9466-48d0-b017-12be46d4f2c6/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.883290 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerID="7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff" exitCode=0 Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.883344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" 
event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerDied","Data":"7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff"} Mar 20 15:59:13 crc kubenswrapper[4764]: I0320 15:59:13.883372 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerStarted","Data":"a481fffafbb9aafd6762b437d2b4e77e02725979e5eda9616a1f7a1b463e3e14"} Mar 20 15:59:14 crc kubenswrapper[4764]: I0320 15:59:14.083948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebb9be-71bf-47d6-9e0c-def343511d34/nova-metadata-log/0.log" Mar 20 15:59:14 crc kubenswrapper[4764]: I0320 15:59:14.355772 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-pxgf2_4381ca4e-12fa-4b60-8bbb-4b4633ad1c1d/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:14 crc kubenswrapper[4764]: I0320 15:59:14.399528 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_daebb9be-71bf-47d6-9e0c-def343511d34/nova-metadata-metadata/0.log" Mar 20 15:59:14 crc kubenswrapper[4764]: I0320 15:59:14.911224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerStarted","Data":"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be"} Mar 20 15:59:14 crc kubenswrapper[4764]: I0320 15:59:14.945863 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ce3534c6-4831-4cf1-9c4a-99bf3e934022/mysql-bootstrap/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.037506 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ce3534c6-4831-4cf1-9c4a-99bf3e934022/mysql-bootstrap/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 
15:59:15.078275 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ce3534c6-4831-4cf1-9c4a-99bf3e934022/galera/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.092309 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bf67384d-ae1b-4966-95af-d73c0e45d7a1/nova-scheduler-scheduler/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.249226 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_02550cd6-b0c3-4f74-a6d2-c9348fc00cc5/mysql-bootstrap/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.480488 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_02550cd6-b0c3-4f74-a6d2-c9348fc00cc5/galera/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.508262 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_be96e3e8-4879-473f-b5e2-34af484ddfcc/openstackclient/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.559721 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_02550cd6-b0c3-4f74-a6d2-c9348fc00cc5/mysql-bootstrap/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.688805 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xlzb7_8a5cf141-3541-4806-83e8-3338f7c2865c/openstack-network-exporter/0.log" Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.919842 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerID="6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be" exitCode=0 Mar 20 15:59:15 crc kubenswrapper[4764]: I0320 15:59:15.919882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" 
event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerDied","Data":"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be"} Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.469890 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kb2ph_bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a/ovsdb-server-init/0.log" Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.743643 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kb2ph_bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a/ovsdb-server/0.log" Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.760853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kb2ph_bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a/ovs-vswitchd/0.log" Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.775142 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-kb2ph_bf87a2bc-c9cc-4831-851a-ffcfca0e7d9a/ovsdb-server-init/0.log" Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.936587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerStarted","Data":"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700"} Mar 20 15:59:16 crc kubenswrapper[4764]: I0320 15:59:16.965428 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qhqb" podStartSLOduration=2.444109888 podStartE2EDuration="4.965404992s" podCreationTimestamp="2026-03-20 15:59:12 +0000 UTC" firstStartedPulling="2026-03-20 15:59:13.88595494 +0000 UTC m=+4075.502144069" lastFinishedPulling="2026-03-20 15:59:16.407250044 +0000 UTC m=+4078.023439173" observedRunningTime="2026-03-20 15:59:16.952968438 +0000 UTC m=+4078.569157567" watchObservedRunningTime="2026-03-20 15:59:16.965404992 +0000 UTC m=+4078.581594121" Mar 20 
15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.156153 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v2tgk_28a506a3-463d-4bc4-ab93-2e8201878e60/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.291550 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a34707a-c81b-4987-b7cb-59ad7b8fa2ef/ovn-northd/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.326642 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6a34707a-c81b-4987-b7cb-59ad7b8fa2ef/openstack-network-exporter/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.512653 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4501c31c-d4db-4881-b791-c4a004cab3d2/openstack-network-exporter/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.622278 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4501c31c-d4db-4881-b791-c4a004cab3d2/ovsdbserver-nb/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.779655 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_70c53684-1a46-497b-8b8d-ab4e90fbe6c2/openstack-network-exporter/0.log" Mar 20 15:59:17 crc kubenswrapper[4764]: I0320 15:59:17.962760 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_70c53684-1a46-497b-8b8d-ab4e90fbe6c2/ovsdbserver-sb/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.333890 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64c4774c48-rhcrn_2f95b4a1-bd62-4e6a-8968-2de23aa0f532/placement-api/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.523675 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-64c4774c48-rhcrn_2f95b4a1-bd62-4e6a-8968-2de23aa0f532/placement-log/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.577242 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ncp4w_cafe34ef-dca1-4fe3-a0a7-7bb9c2dc050d/ovn-controller/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.597416 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eddb3def-0cd3-4d16-954a-dff2909e681f/setup-container/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.727298 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eddb3def-0cd3-4d16-954a-dff2909e681f/setup-container/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.809670 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f/setup-container/0.log" Mar 20 15:59:18 crc kubenswrapper[4764]: I0320 15:59:18.910564 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_eddb3def-0cd3-4d16-954a-dff2909e681f/rabbitmq/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.079574 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f/setup-container/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.164836 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c10d2f6e-a52b-4ee3-ae77-7ebfa6120c1f/rabbitmq/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.312521 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-s2sj6_0b67f53a-96b2-487b-b68e-60560ba40a02/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.412611 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7n5g5_09e73d7f-44ca-4b1f-bf4c-aa4793441e30/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.653029 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-xc89b_2bc8745c-8628-48f5-9562-1de3f0c30286/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.702026 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2x6qg_0e86c9d2-c700-4e8a-aec2-d808fe36be79/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:19 crc kubenswrapper[4764]: I0320 15:59:19.942680 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xxvwp_6921f642-8ca4-4d60-bd80-9e5db110986f/ssh-known-hosts-edpm-deployment/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.113362 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7957c5bfd5-lfmpr_884c1077-0801-4570-b8ef-195767d65d2c/proxy-httpd/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.163534 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7957c5bfd5-lfmpr_884c1077-0801-4570-b8ef-195767d65d2c/proxy-server/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.298243 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fqbvt_14bf0f11-9be0-4cd6-9395-a9c2d4e12706/swift-ring-rebalance/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.483956 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/account-auditor/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.492720 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/account-reaper/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.541964 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/account-replicator/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.750664 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/account-server/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.827022 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/container-replicator/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.842398 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/container-auditor/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.894134 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/container-server/0.log" Mar 20 15:59:20 crc kubenswrapper[4764]: I0320 15:59:20.971873 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/container-updater/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.140834 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/object-replicator/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.151747 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/object-expirer/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.195238 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/object-auditor/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.232857 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/object-server/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.343339 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/rsync/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.387736 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/object-updater/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.550845 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6a1376b-fe31-4d29-b9de-c51d7b1c5ea0/swift-recon-cron/0.log" Mar 20 15:59:21 crc kubenswrapper[4764]: I0320 15:59:21.884675 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_8fc64043-bf77-406c-930f-9663e608f2c9/test-operator-logs-container/0.log" Mar 20 15:59:22 crc kubenswrapper[4764]: I0320 15:59:22.058612 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qftg7_caadbad0-3673-4b77-9805-5d50cf754588/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:22 crc kubenswrapper[4764]: I0320 15:59:22.113204 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-h9bq6_e14aaa5f-0501-4ce2-b63a-08fde03ed17a/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 15:59:22 crc kubenswrapper[4764]: I0320 15:59:22.590708 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_2f991298-5b9e-4568-b8b0-24d9d1978a6d/tempest-tests-tempest-tests-runner/0.log" Mar 20 15:59:23 crc kubenswrapper[4764]: I0320 15:59:23.092396 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:23 crc kubenswrapper[4764]: I0320 15:59:23.092434 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:23 crc kubenswrapper[4764]: I0320 15:59:23.433159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:24 crc kubenswrapper[4764]: I0320 15:59:24.065024 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:24 crc kubenswrapper[4764]: I0320 15:59:24.122227 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.021294 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qhqb" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="registry-server" containerID="cri-o://7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700" gracePeriod=2 Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.527393 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.547473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z7s\" (UniqueName: \"kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s\") pod \"c2577211-5d7e-4353-ad44-a33f8cc2736c\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.547541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities\") pod \"c2577211-5d7e-4353-ad44-a33f8cc2736c\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.547755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content\") pod \"c2577211-5d7e-4353-ad44-a33f8cc2736c\" (UID: \"c2577211-5d7e-4353-ad44-a33f8cc2736c\") " Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.548608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities" (OuterVolumeSpecName: "utilities") pod "c2577211-5d7e-4353-ad44-a33f8cc2736c" (UID: "c2577211-5d7e-4353-ad44-a33f8cc2736c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.570176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s" (OuterVolumeSpecName: "kube-api-access-z8z7s") pod "c2577211-5d7e-4353-ad44-a33f8cc2736c" (UID: "c2577211-5d7e-4353-ad44-a33f8cc2736c"). InnerVolumeSpecName "kube-api-access-z8z7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.649044 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z7s\" (UniqueName: \"kubernetes.io/projected/c2577211-5d7e-4353-ad44-a33f8cc2736c-kube-api-access-z8z7s\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.649075 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.709159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2577211-5d7e-4353-ad44-a33f8cc2736c" (UID: "c2577211-5d7e-4353-ad44-a33f8cc2736c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 15:59:26 crc kubenswrapper[4764]: I0320 15:59:26.750338 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2577211-5d7e-4353-ad44-a33f8cc2736c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.032945 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerID="7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700" exitCode=0 Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.032983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qhqb" event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerDied","Data":"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700"} Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.033010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5qhqb" event={"ID":"c2577211-5d7e-4353-ad44-a33f8cc2736c","Type":"ContainerDied","Data":"a481fffafbb9aafd6762b437d2b4e77e02725979e5eda9616a1f7a1b463e3e14"} Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.033030 4764 scope.go:117] "RemoveContainer" containerID="7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.033033 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qhqb" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.062569 4764 scope.go:117] "RemoveContainer" containerID="6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.088115 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.092656 4764 scope.go:117] "RemoveContainer" containerID="7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.103087 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qhqb"] Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.138708 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" path="/var/lib/kubelet/pods/c2577211-5d7e-4353-ad44-a33f8cc2736c/volumes" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.158180 4764 scope.go:117] "RemoveContainer" containerID="7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700" Mar 20 15:59:27 crc kubenswrapper[4764]: E0320 15:59:27.158998 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700\": container with ID starting with 
7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700 not found: ID does not exist" containerID="7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.159034 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700"} err="failed to get container status \"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700\": rpc error: code = NotFound desc = could not find container \"7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700\": container with ID starting with 7674a0029e180c50fdb4a1726d0ee3a677f1811009f167e00cfa4e9ba9518700 not found: ID does not exist" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.159058 4764 scope.go:117] "RemoveContainer" containerID="6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be" Mar 20 15:59:27 crc kubenswrapper[4764]: E0320 15:59:27.159465 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be\": container with ID starting with 6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be not found: ID does not exist" containerID="6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.159488 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be"} err="failed to get container status \"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be\": rpc error: code = NotFound desc = could not find container \"6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be\": container with ID starting with 6f6add6d5292f616381c964ee4fff0775bcb9f762f78bee6e9cb4d2d0b7828be not found: ID does not 
exist" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.159502 4764 scope.go:117] "RemoveContainer" containerID="7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff" Mar 20 15:59:27 crc kubenswrapper[4764]: E0320 15:59:27.159925 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff\": container with ID starting with 7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff not found: ID does not exist" containerID="7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff" Mar 20 15:59:27 crc kubenswrapper[4764]: I0320 15:59:27.159971 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff"} err="failed to get container status \"7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff\": rpc error: code = NotFound desc = could not find container \"7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff\": container with ID starting with 7209049a0aaa9dc9f5aa320557b2eabeb823e7b0181928787f800ee362b8d6ff not found: ID does not exist" Mar 20 15:59:30 crc kubenswrapper[4764]: I0320 15:59:30.144095 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_161fc524-f747-4350-ac9c-0670b2a338bb/memcached/0.log" Mar 20 15:59:38 crc kubenswrapper[4764]: I0320 15:59:38.444159 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 15:59:38 crc kubenswrapper[4764]: I0320 15:59:38.444798 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 15:59:51 crc kubenswrapper[4764]: I0320 15:59:51.780167 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/util/0.log" Mar 20 15:59:51 crc kubenswrapper[4764]: I0320 15:59:51.963568 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/util/0.log" Mar 20 15:59:51 crc kubenswrapper[4764]: I0320 15:59:51.963940 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/pull/0.log" Mar 20 15:59:51 crc kubenswrapper[4764]: I0320 15:59:51.978663 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/pull/0.log" Mar 20 15:59:52 crc kubenswrapper[4764]: I0320 15:59:52.681789 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/pull/0.log" Mar 20 15:59:52 crc kubenswrapper[4764]: I0320 15:59:52.728649 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/util/0.log" Mar 20 15:59:52 crc kubenswrapper[4764]: I0320 15:59:52.742333 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a9056c04d6e81fdf933671501db94ba13de4a4e523a1b20144a3796916tg66l_2c0a59ef-c6d3-40b8-8134-4223ae9d69fe/extract/0.log" Mar 20 15:59:53 crc kubenswrapper[4764]: I0320 15:59:53.255566 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-2tslm_639f5d85-78ac-409d-b2ed-b809cb59bfc5/manager/0.log" Mar 20 15:59:53 crc kubenswrapper[4764]: I0320 15:59:53.354817 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-xrgp9_90b3a567-d460-4c6c-ba32-aaa43faf3add/manager/0.log" Mar 20 15:59:53 crc kubenswrapper[4764]: I0320 15:59:53.591840 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-gh5rq_7936786e-744e-4fc9-bdd2-3e9ee6c7d3bf/manager/0.log" Mar 20 15:59:53 crc kubenswrapper[4764]: I0320 15:59:53.677016 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-5nkz6_3b81c755-123b-4caa-bc28-43ce8b672547/manager/0.log" Mar 20 15:59:53 crc kubenswrapper[4764]: I0320 15:59:53.934598 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-bh9n8_f38f9570-e767-488b-b3d1-97da8b4afa56/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.206044 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8f4f4_8895943a-8d4d-471a-a0db-44eb2e832119/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.376357 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-kj5l7_0d25f0a7-3740-4f16-96f3-63b0f587f0a0/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.536921 4764 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dh2nv_c8815a47-3a15-4fcb-a8eb-c72f767b30f0/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.544760 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-q6wd6_46418e4e-189e-4cc1-8df8-343f53697f68/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.700198 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-hkvw2_238654a5-1849-4f9f-9496-a8e796655b37/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.793092 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-sdqrr_8a040614-ef32-4b0d-a5ae-a3336d26bc71/manager/0.log" Mar 20 15:59:54 crc kubenswrapper[4764]: I0320 15:59:54.937516 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-24cf9_0410685f-0ab6-43db-9831-f6cd0b0e7f6f/manager/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.217545 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-pwjdt_5f0943ff-7d84-467b-9d22-53fdacb1b054/manager/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.264823 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-7lxwm_f111a23d-c6f2-4ca4-9434-aa20eafdf979/manager/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.446172 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-z2q6p_6efa9b2a-452d-42b2-bb4c-fe8d41747d3f/manager/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 
15:59:55.581513 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-74bc6f6bf8-hk7p7_53c82166-2b9f-4340-8eb1-c95569faca61/operator/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.684606 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-slfdz_168e96d7-6450-4a0f-95ba-d9d42d7ab187/registry-server/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.877719 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-555bbdc4dc-c94kc_73b3ae86-f483-4083-8eb1-9925cac6b796/manager/0.log" Mar 20 15:59:55 crc kubenswrapper[4764]: I0320 15:59:55.995183 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d2jjj_5783d021-3258-4703-b4d8-0989af31ce65/manager/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 15:59:56.187401 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-mqh5f_0254dff8-49b6-49a6-a79b-c366bd0f247e/manager/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 15:59:56.293188 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-jjrw6_605fb18f-f047-4e83-bdf7-24556aec2ed8/operator/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 15:59:56.547626 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-skvs9_c9009e06-5771-43ec-a800-c79012d6c18e/manager/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 15:59:56.564126 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-wpnxh_d5b94bce-37f0-4816-bfa7-6947c258f201/manager/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 
15:59:56.678364 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-62lwt_7232f0f7-0987-43fc-ab04-eaf226617757/manager/0.log" Mar 20 15:59:56 crc kubenswrapper[4764]: I0320 15:59:56.900997 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6f477c5b6b-tqfm2_bce3e053-25d9-4eb2-933f-3e40a6ae89ab/manager/0.log" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.157084 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z"] Mar 20 16:00:00 crc kubenswrapper[4764]: E0320 16:00:00.158152 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="extract-content" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.158169 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="extract-content" Mar 20 16:00:00 crc kubenswrapper[4764]: E0320 16:00:00.158201 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="registry-server" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.158207 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="registry-server" Mar 20 16:00:00 crc kubenswrapper[4764]: E0320 16:00:00.158222 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="extract-utilities" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.158228 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="extract-utilities" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.158448 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c2577211-5d7e-4353-ad44-a33f8cc2736c" containerName="registry-server" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.159064 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.171191 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.173751 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.176249 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z"] Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.265755 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567040-x4qfd"] Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.266946 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.268997 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.269441 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.269498 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.279715 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-x4qfd"] Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.299719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn62f\" (UniqueName: \"kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f\") pod \"auto-csr-approver-29567040-x4qfd\" (UID: \"ed49413b-1501-46fd-a4df-7e17198a21d7\") " pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.299765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.299802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldq8\" (UniqueName: \"kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8\") pod \"collect-profiles-29567040-6pw5z\" (UID: 
\"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.299828 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.401210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldq8\" (UniqueName: \"kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.401252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.401397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn62f\" (UniqueName: \"kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f\") pod \"auto-csr-approver-29567040-x4qfd\" (UID: \"ed49413b-1501-46fd-a4df-7e17198a21d7\") " pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.401432 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.402357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.410369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.427021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldq8\" (UniqueName: \"kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8\") pod \"collect-profiles-29567040-6pw5z\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.429998 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn62f\" (UniqueName: \"kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f\") pod \"auto-csr-approver-29567040-x4qfd\" (UID: \"ed49413b-1501-46fd-a4df-7e17198a21d7\") " pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.493109 4764 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:00 crc kubenswrapper[4764]: I0320 16:00:00.598646 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.009693 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z"] Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.153683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-x4qfd"] Mar 20 16:00:01 crc kubenswrapper[4764]: W0320 16:00:01.155032 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded49413b_1501_46fd_a4df_7e17198a21d7.slice/crio-cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6 WatchSource:0}: Error finding container cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6: Status 404 returned error can't find the container with id cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6 Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.382343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" event={"ID":"2a80a95a-b770-4111-9e35-f00418c9f1bd","Type":"ContainerStarted","Data":"6cd1ff3eea1eeb0f64f9d4f40621cae0ebe6d07416d7a0615f4f298e8b017b94"} Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.382427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" event={"ID":"2a80a95a-b770-4111-9e35-f00418c9f1bd","Type":"ContainerStarted","Data":"daba3dcd52b2cc6dcee7258af5090b351b604e3ae81c090d17a8ffc8299cb87a"} Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.383856 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" event={"ID":"ed49413b-1501-46fd-a4df-7e17198a21d7","Type":"ContainerStarted","Data":"cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6"} Mar 20 16:00:01 crc kubenswrapper[4764]: I0320 16:00:01.410030 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" podStartSLOduration=1.4100117810000001 podStartE2EDuration="1.410011781s" podCreationTimestamp="2026-03-20 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:00:01.396115821 +0000 UTC m=+4123.012304950" watchObservedRunningTime="2026-03-20 16:00:01.410011781 +0000 UTC m=+4123.026200910" Mar 20 16:00:02 crc kubenswrapper[4764]: I0320 16:00:02.394726 4764 generic.go:334] "Generic (PLEG): container finished" podID="2a80a95a-b770-4111-9e35-f00418c9f1bd" containerID="6cd1ff3eea1eeb0f64f9d4f40621cae0ebe6d07416d7a0615f4f298e8b017b94" exitCode=0 Mar 20 16:00:02 crc kubenswrapper[4764]: I0320 16:00:02.394973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" event={"ID":"2a80a95a-b770-4111-9e35-f00418c9f1bd","Type":"ContainerDied","Data":"6cd1ff3eea1eeb0f64f9d4f40621cae0ebe6d07416d7a0615f4f298e8b017b94"} Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.765691 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.875557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldq8\" (UniqueName: \"kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8\") pod \"2a80a95a-b770-4111-9e35-f00418c9f1bd\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.875666 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume\") pod \"2a80a95a-b770-4111-9e35-f00418c9f1bd\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.875728 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume\") pod \"2a80a95a-b770-4111-9e35-f00418c9f1bd\" (UID: \"2a80a95a-b770-4111-9e35-f00418c9f1bd\") " Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.876330 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a80a95a-b770-4111-9e35-f00418c9f1bd" (UID: "2a80a95a-b770-4111-9e35-f00418c9f1bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.893596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a80a95a-b770-4111-9e35-f00418c9f1bd" (UID: "2a80a95a-b770-4111-9e35-f00418c9f1bd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.895577 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8" (OuterVolumeSpecName: "kube-api-access-jldq8") pod "2a80a95a-b770-4111-9e35-f00418c9f1bd" (UID: "2a80a95a-b770-4111-9e35-f00418c9f1bd"). InnerVolumeSpecName "kube-api-access-jldq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.977994 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldq8\" (UniqueName: \"kubernetes.io/projected/2a80a95a-b770-4111-9e35-f00418c9f1bd-kube-api-access-jldq8\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.978223 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a80a95a-b770-4111-9e35-f00418c9f1bd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:03 crc kubenswrapper[4764]: I0320 16:00:03.978284 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a80a95a-b770-4111-9e35-f00418c9f1bd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:04 crc kubenswrapper[4764]: I0320 16:00:04.413084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" event={"ID":"2a80a95a-b770-4111-9e35-f00418c9f1bd","Type":"ContainerDied","Data":"daba3dcd52b2cc6dcee7258af5090b351b604e3ae81c090d17a8ffc8299cb87a"} Mar 20 16:00:04 crc kubenswrapper[4764]: I0320 16:00:04.413299 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daba3dcd52b2cc6dcee7258af5090b351b604e3ae81c090d17a8ffc8299cb87a" Mar 20 16:00:04 crc kubenswrapper[4764]: I0320 16:00:04.413335 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567040-6pw5z" Mar 20 16:00:04 crc kubenswrapper[4764]: I0320 16:00:04.475342 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh"] Mar 20 16:00:04 crc kubenswrapper[4764]: I0320 16:00:04.490482 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566995-pwdxh"] Mar 20 16:00:05 crc kubenswrapper[4764]: I0320 16:00:05.138047 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971cebcf-ddc9-443e-99aa-6e110026e383" path="/var/lib/kubelet/pods/971cebcf-ddc9-443e-99aa-6e110026e383/volumes" Mar 20 16:00:08 crc kubenswrapper[4764]: I0320 16:00:08.443320 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:00:08 crc kubenswrapper[4764]: I0320 16:00:08.443896 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:00:08 crc kubenswrapper[4764]: I0320 16:00:08.444757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" event={"ID":"ed49413b-1501-46fd-a4df-7e17198a21d7","Type":"ContainerStarted","Data":"febd3d27105ca5cb0b35515cf5c111388c57c0fffd34bea7035a60f96275a4eb"} Mar 20 16:00:08 crc kubenswrapper[4764]: I0320 16:00:08.464842 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29567040-x4qfd" podStartSLOduration=1.673775421 podStartE2EDuration="8.464819733s" podCreationTimestamp="2026-03-20 16:00:00 +0000 UTC" firstStartedPulling="2026-03-20 16:00:01.157938522 +0000 UTC m=+4122.774127651" lastFinishedPulling="2026-03-20 16:00:07.948982834 +0000 UTC m=+4129.565171963" observedRunningTime="2026-03-20 16:00:08.457258259 +0000 UTC m=+4130.073447388" watchObservedRunningTime="2026-03-20 16:00:08.464819733 +0000 UTC m=+4130.081008862" Mar 20 16:00:09 crc kubenswrapper[4764]: I0320 16:00:09.457772 4764 generic.go:334] "Generic (PLEG): container finished" podID="ed49413b-1501-46fd-a4df-7e17198a21d7" containerID="febd3d27105ca5cb0b35515cf5c111388c57c0fffd34bea7035a60f96275a4eb" exitCode=0 Mar 20 16:00:09 crc kubenswrapper[4764]: I0320 16:00:09.457912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" event={"ID":"ed49413b-1501-46fd-a4df-7e17198a21d7","Type":"ContainerDied","Data":"febd3d27105ca5cb0b35515cf5c111388c57c0fffd34bea7035a60f96275a4eb"} Mar 20 16:00:10 crc kubenswrapper[4764]: I0320 16:00:10.838157 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.009285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn62f\" (UniqueName: \"kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f\") pod \"ed49413b-1501-46fd-a4df-7e17198a21d7\" (UID: \"ed49413b-1501-46fd-a4df-7e17198a21d7\") " Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.021587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f" (OuterVolumeSpecName: "kube-api-access-nn62f") pod "ed49413b-1501-46fd-a4df-7e17198a21d7" (UID: "ed49413b-1501-46fd-a4df-7e17198a21d7"). InnerVolumeSpecName "kube-api-access-nn62f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.111689 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn62f\" (UniqueName: \"kubernetes.io/projected/ed49413b-1501-46fd-a4df-7e17198a21d7-kube-api-access-nn62f\") on node \"crc\" DevicePath \"\"" Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.473724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" event={"ID":"ed49413b-1501-46fd-a4df-7e17198a21d7","Type":"ContainerDied","Data":"cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6"} Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.473769 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2b6900f4c321bd600e0df73d967633ab337d340ee863c43c5d27ac184b13e6" Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.473816 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567040-x4qfd" Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.530123 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-bpg9r"] Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.538588 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567034-bpg9r"] Mar 20 16:00:11 crc kubenswrapper[4764]: I0320 16:00:11.792806 4764 scope.go:117] "RemoveContainer" containerID="76dccbd33eecbed08757eba58f315bbc66e0f8508f7e534e54afd7e588597f0c" Mar 20 16:00:13 crc kubenswrapper[4764]: I0320 16:00:13.136938 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6464809-48a0-4768-a2d8-4391ccfc04c4" path="/var/lib/kubelet/pods/d6464809-48a0-4768-a2d8-4391ccfc04c4/volumes" Mar 20 16:00:20 crc kubenswrapper[4764]: I0320 16:00:20.969785 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xmwp9_08bd5c50-7656-4a0a-9d9e-9f79eead7527/control-plane-machine-set-operator/0.log" Mar 20 16:00:21 crc kubenswrapper[4764]: I0320 16:00:21.119984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cmp9t_7bc79882-ee25-421d-abfc-7d2684bd348f/kube-rbac-proxy/0.log" Mar 20 16:00:21 crc kubenswrapper[4764]: I0320 16:00:21.134085 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cmp9t_7bc79882-ee25-421d-abfc-7d2684bd348f/machine-api-operator/0.log" Mar 20 16:00:35 crc kubenswrapper[4764]: I0320 16:00:35.860269 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-hn9sv_e202f2fc-57d5-4c62-838e-0835c8add77c/cert-manager-controller/0.log" Mar 20 16:00:36 crc kubenswrapper[4764]: I0320 16:00:36.048026 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-ctqb5_0e9160a5-c44c-4f1c-8a83-2b4944b30542/cert-manager-cainjector/0.log" Mar 20 16:00:36 crc kubenswrapper[4764]: I0320 16:00:36.068578 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cdjgb_a78d7fb5-1e8d-45e3-8af0-105378a7c9ae/cert-manager-webhook/0.log" Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.443366 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.443764 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.443816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.444688 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.444748 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" 
podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" containerID="cri-o://57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f" gracePeriod=600 Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.734993 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f" exitCode=0 Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.735802 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f"} Mar 20 16:00:38 crc kubenswrapper[4764]: I0320 16:00:38.735858 4764 scope.go:117] "RemoveContainer" containerID="17fc3105f21f5310950493cceb69b006ef47457c22b8e6e04961e642d5cf69f3" Mar 20 16:00:39 crc kubenswrapper[4764]: I0320 16:00:39.746042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerStarted","Data":"c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718"} Mar 20 16:00:51 crc kubenswrapper[4764]: I0320 16:00:51.273852 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2k287_f7b7d942-96ce-48ee-a090-ac7dc8fd3e27/nmstate-console-plugin/0.log" Mar 20 16:00:51 crc kubenswrapper[4764]: I0320 16:00:51.525810 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vlrv5_6969b81c-842c-4f38-9749-0716f49aff6c/nmstate-handler/0.log" Mar 20 16:00:51 crc kubenswrapper[4764]: I0320 16:00:51.557046 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-j6b82_d682151d-aba5-445d-89fc-d0e67cf41258/kube-rbac-proxy/0.log" Mar 20 16:00:51 crc kubenswrapper[4764]: I0320 16:00:51.696324 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-j6b82_d682151d-aba5-445d-89fc-d0e67cf41258/nmstate-metrics/0.log" Mar 20 16:00:51 crc kubenswrapper[4764]: I0320 16:00:51.756543 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hlh2h_12847360-40b1-45fc-aa82-6926fe0d9b8a/nmstate-operator/0.log" Mar 20 16:00:52 crc kubenswrapper[4764]: I0320 16:00:52.329498 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mrkwg_b89d9212-3f44-4f7e-8a90-80752f91f8d8/nmstate-webhook/0.log" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.157445 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567041-x4v7h"] Mar 20 16:01:00 crc kubenswrapper[4764]: E0320 16:01:00.158571 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a80a95a-b770-4111-9e35-f00418c9f1bd" containerName="collect-profiles" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.158591 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a80a95a-b770-4111-9e35-f00418c9f1bd" containerName="collect-profiles" Mar 20 16:01:00 crc kubenswrapper[4764]: E0320 16:01:00.158626 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed49413b-1501-46fd-a4df-7e17198a21d7" containerName="oc" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.158634 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed49413b-1501-46fd-a4df-7e17198a21d7" containerName="oc" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.158924 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed49413b-1501-46fd-a4df-7e17198a21d7" containerName="oc" Mar 20 16:01:00 crc 
kubenswrapper[4764]: I0320 16:01:00.158988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a80a95a-b770-4111-9e35-f00418c9f1bd" containerName="collect-profiles" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.159814 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.170517 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-x4v7h"] Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.348390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.348462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.348550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrrn\" (UniqueName: \"kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.348577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.449868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrrn\" (UniqueName: \"kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.449918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.450001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.450056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.461351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.461466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.461719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.481738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrrn\" (UniqueName: \"kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn\") pod \"keystone-cron-29567041-x4v7h\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:00 crc kubenswrapper[4764]: I0320 16:01:00.488662 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:01 crc kubenswrapper[4764]: I0320 16:01:01.039711 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567041-x4v7h"] Mar 20 16:01:01 crc kubenswrapper[4764]: I0320 16:01:01.983482 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-x4v7h" event={"ID":"27ab51a1-e330-4b4b-ba33-92b3bb255950","Type":"ContainerStarted","Data":"1b30593886cdd3e2b6f1d6e72ebdbf07f65fd143e53cde9afc66296ba58ad546"} Mar 20 16:01:01 crc kubenswrapper[4764]: I0320 16:01:01.983806 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-x4v7h" event={"ID":"27ab51a1-e330-4b4b-ba33-92b3bb255950","Type":"ContainerStarted","Data":"21e87bb8c994574f2d2807f13cccf38a787c6518ad92749979d7700373797892"} Mar 20 16:01:02 crc kubenswrapper[4764]: I0320 16:01:02.008846 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567041-x4v7h" podStartSLOduration=2.008821097 podStartE2EDuration="2.008821097s" podCreationTimestamp="2026-03-20 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 16:01:01.999808488 +0000 UTC m=+4183.615997617" watchObservedRunningTime="2026-03-20 16:01:02.008821097 +0000 UTC m=+4183.625010236" Mar 20 16:01:04 crc kubenswrapper[4764]: I0320 16:01:04.002393 4764 generic.go:334] "Generic (PLEG): container finished" podID="27ab51a1-e330-4b4b-ba33-92b3bb255950" containerID="1b30593886cdd3e2b6f1d6e72ebdbf07f65fd143e53cde9afc66296ba58ad546" exitCode=0 Mar 20 16:01:04 crc kubenswrapper[4764]: I0320 16:01:04.002511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-x4v7h" 
event={"ID":"27ab51a1-e330-4b4b-ba33-92b3bb255950","Type":"ContainerDied","Data":"1b30593886cdd3e2b6f1d6e72ebdbf07f65fd143e53cde9afc66296ba58ad546"} Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.342881 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.448721 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys\") pod \"27ab51a1-e330-4b4b-ba33-92b3bb255950\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.448921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wrrn\" (UniqueName: \"kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn\") pod \"27ab51a1-e330-4b4b-ba33-92b3bb255950\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.448951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle\") pod \"27ab51a1-e330-4b4b-ba33-92b3bb255950\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.449035 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data\") pod \"27ab51a1-e330-4b4b-ba33-92b3bb255950\" (UID: \"27ab51a1-e330-4b4b-ba33-92b3bb255950\") " Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.455235 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "27ab51a1-e330-4b4b-ba33-92b3bb255950" (UID: "27ab51a1-e330-4b4b-ba33-92b3bb255950"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.455546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn" (OuterVolumeSpecName: "kube-api-access-2wrrn") pod "27ab51a1-e330-4b4b-ba33-92b3bb255950" (UID: "27ab51a1-e330-4b4b-ba33-92b3bb255950"). InnerVolumeSpecName "kube-api-access-2wrrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.490048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27ab51a1-e330-4b4b-ba33-92b3bb255950" (UID: "27ab51a1-e330-4b4b-ba33-92b3bb255950"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.539691 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data" (OuterVolumeSpecName: "config-data") pod "27ab51a1-e330-4b4b-ba33-92b3bb255950" (UID: "27ab51a1-e330-4b4b-ba33-92b3bb255950"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.551630 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wrrn\" (UniqueName: \"kubernetes.io/projected/27ab51a1-e330-4b4b-ba33-92b3bb255950-kube-api-access-2wrrn\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.551661 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.551671 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:05 crc kubenswrapper[4764]: I0320 16:01:05.551679 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/27ab51a1-e330-4b4b-ba33-92b3bb255950-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 16:01:06 crc kubenswrapper[4764]: I0320 16:01:06.020884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567041-x4v7h" event={"ID":"27ab51a1-e330-4b4b-ba33-92b3bb255950","Type":"ContainerDied","Data":"21e87bb8c994574f2d2807f13cccf38a787c6518ad92749979d7700373797892"} Mar 20 16:01:06 crc kubenswrapper[4764]: I0320 16:01:06.020924 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e87bb8c994574f2d2807f13cccf38a787c6518ad92749979d7700373797892" Mar 20 16:01:06 crc kubenswrapper[4764]: I0320 16:01:06.020976 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567041-x4v7h" Mar 20 16:01:11 crc kubenswrapper[4764]: I0320 16:01:11.962276 4764 scope.go:117] "RemoveContainer" containerID="9ac16496a500ba03f349d47ad03ba524d07870ddba98e0ef7ac156b0dcaf6b36" Mar 20 16:01:22 crc kubenswrapper[4764]: I0320 16:01:22.669526 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bbvxz_bbdf4541-5578-4f99-b7fb-bc3be4cd939a/kube-rbac-proxy/0.log" Mar 20 16:01:22 crc kubenswrapper[4764]: I0320 16:01:22.744156 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bbvxz_bbdf4541-5578-4f99-b7fb-bc3be4cd939a/controller/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.302530 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-frr-files/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.464294 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-frr-files/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.471937 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-reloader/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.506861 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-metrics/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.524844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-reloader/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.674853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-frr-files/0.log" Mar 20 16:01:23 crc 
kubenswrapper[4764]: I0320 16:01:23.698138 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-reloader/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.710375 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-metrics/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.735777 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-metrics/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.944776 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-reloader/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.945090 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-frr-files/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.957775 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/cp-metrics/0.log" Mar 20 16:01:23 crc kubenswrapper[4764]: I0320 16:01:23.959096 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/controller/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.157930 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/frr-metrics/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.174133 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/kube-rbac-proxy/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.179060 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/kube-rbac-proxy-frr/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.415559 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/reloader/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.467163 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-zwrnf_797c71ba-f218-4ebc-9109-c1a8e36a1f75/frr-k8s-webhook-server/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.649470 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-f7bc86596-htwbr_2a6607dc-2637-4ccd-85ee-63db60070729/manager/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.826975 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5766dfb6c-m75b9_fcf9844d-cd3b-400c-b7c2-ce3bc92f59e7/webhook-server/0.log" Mar 20 16:01:24 crc kubenswrapper[4764]: I0320 16:01:24.999013 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc5vs_343e987b-163e-410c-a81d-83f3abb76064/kube-rbac-proxy/0.log" Mar 20 16:01:25 crc kubenswrapper[4764]: I0320 16:01:25.621948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vc5vs_343e987b-163e-410c-a81d-83f3abb76064/speaker/0.log" Mar 20 16:01:25 crc kubenswrapper[4764]: I0320 16:01:25.928893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r9q5r_3767b150-6238-4386-9155-4e198e0ee2d2/frr/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.354665 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/util/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: 
I0320 16:01:39.654665 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/util/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.673403 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/pull/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.707255 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/pull/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.883522 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/util/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.885417 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/pull/0.log" Mar 20 16:01:39 crc kubenswrapper[4764]: I0320 16:01:39.943790 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748h8th_66364f10-f71c-48d8-9691-6b840a589609/extract/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.053522 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/util/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.214593 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/util/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.251724 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/pull/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.261467 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/pull/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.986807 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/util/0.log" Mar 20 16:01:40 crc kubenswrapper[4764]: I0320 16:01:40.986854 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/extract/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.022637 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1dlbcm_28cc9951-8e26-4c96-8dae-0151015c425b/pull/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.145447 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-utilities/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.335914 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-utilities/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 
16:01:41.383330 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-content/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.403694 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-content/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.561152 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-content/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.574251 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/extract-utilities/0.log" Mar 20 16:01:41 crc kubenswrapper[4764]: I0320 16:01:41.758155 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-utilities/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.037635 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-content/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.054932 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-utilities/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.062127 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-content/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.176234 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-dddff_68519d16-b007-47ac-9070-20ac1b3c3b2f/registry-server/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.698551 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-content/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.712645 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/extract-utilities/0.log" Mar 20 16:01:42 crc kubenswrapper[4764]: I0320 16:01:42.992993 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wgkcv_ad389133-cb52-4ed9-9261-daeb7a5cb13e/marketplace-operator/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.058073 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-utilities/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.308328 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4td2_656d8d0a-7ccf-4216-bde0-8e12af697dc0/registry-server/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.344865 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-content/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.400011 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-utilities/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.406876 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-content/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.595818 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-utilities/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.697473 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/extract-content/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.750763 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-25sqr_5da442f8-8370-4046-9d12-a9a301d61a58/registry-server/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.780309 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-utilities/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.954159 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-content/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.961290 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-utilities/0.log" Mar 20 16:01:43 crc kubenswrapper[4764]: I0320 16:01:43.986080 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-content/0.log" Mar 20 16:01:44 crc kubenswrapper[4764]: I0320 16:01:44.207151 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-utilities/0.log" 
Mar 20 16:01:44 crc kubenswrapper[4764]: I0320 16:01:44.303512 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/extract-content/0.log" Mar 20 16:01:44 crc kubenswrapper[4764]: I0320 16:01:44.806451 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2672k_d3ad1633-003c-4d92-bdfa-c0f6c1957cfd/registry-server/0.log" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.148338 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567042-prmhk"] Mar 20 16:02:00 crc kubenswrapper[4764]: E0320 16:02:00.149392 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ab51a1-e330-4b4b-ba33-92b3bb255950" containerName="keystone-cron" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.149409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ab51a1-e330-4b4b-ba33-92b3bb255950" containerName="keystone-cron" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.149616 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ab51a1-e330-4b4b-ba33-92b3bb255950" containerName="keystone-cron" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.150254 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.152098 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.152206 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.152585 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.158136 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-prmhk"] Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.247743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98fk8\" (UniqueName: \"kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8\") pod \"auto-csr-approver-29567042-prmhk\" (UID: \"81f66340-cdef-4b02-889d-c43083631d74\") " pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.352740 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fk8\" (UniqueName: \"kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8\") pod \"auto-csr-approver-29567042-prmhk\" (UID: \"81f66340-cdef-4b02-889d-c43083631d74\") " pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.390159 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fk8\" (UniqueName: \"kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8\") pod \"auto-csr-approver-29567042-prmhk\" (UID: \"81f66340-cdef-4b02-889d-c43083631d74\") " 
pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.472984 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:00 crc kubenswrapper[4764]: W0320 16:02:00.956372 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f66340_cdef_4b02_889d_c43083631d74.slice/crio-12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4 WatchSource:0}: Error finding container 12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4: Status 404 returned error can't find the container with id 12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4 Mar 20 16:02:00 crc kubenswrapper[4764]: I0320 16:02:00.956438 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-prmhk"] Mar 20 16:02:01 crc kubenswrapper[4764]: I0320 16:02:01.490562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-prmhk" event={"ID":"81f66340-cdef-4b02-889d-c43083631d74","Type":"ContainerStarted","Data":"12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4"} Mar 20 16:02:03 crc kubenswrapper[4764]: I0320 16:02:03.505805 4764 generic.go:334] "Generic (PLEG): container finished" podID="81f66340-cdef-4b02-889d-c43083631d74" containerID="055e5af2b4ba3fa6fe7f6b01b179ad558c895321650653610d6c3a2f0e1cb544" exitCode=0 Mar 20 16:02:03 crc kubenswrapper[4764]: I0320 16:02:03.507019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-prmhk" event={"ID":"81f66340-cdef-4b02-889d-c43083631d74","Type":"ContainerDied","Data":"055e5af2b4ba3fa6fe7f6b01b179ad558c895321650653610d6c3a2f0e1cb544"} Mar 20 16:02:04 crc kubenswrapper[4764]: I0320 16:02:04.873805 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:04 crc kubenswrapper[4764]: I0320 16:02:04.950213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98fk8\" (UniqueName: \"kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8\") pod \"81f66340-cdef-4b02-889d-c43083631d74\" (UID: \"81f66340-cdef-4b02-889d-c43083631d74\") " Mar 20 16:02:04 crc kubenswrapper[4764]: I0320 16:02:04.956754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8" (OuterVolumeSpecName: "kube-api-access-98fk8") pod "81f66340-cdef-4b02-889d-c43083631d74" (UID: "81f66340-cdef-4b02-889d-c43083631d74"). InnerVolumeSpecName "kube-api-access-98fk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.054936 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98fk8\" (UniqueName: \"kubernetes.io/projected/81f66340-cdef-4b02-889d-c43083631d74-kube-api-access-98fk8\") on node \"crc\" DevicePath \"\"" Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.536843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567042-prmhk" event={"ID":"81f66340-cdef-4b02-889d-c43083631d74","Type":"ContainerDied","Data":"12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4"} Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.536886 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b3a63e83358abdd287b19ac4d1d1d406e9cebaf1ec1f44c4deefbb91eb41c4" Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.536952 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567042-prmhk" Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.958870 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-7hjfl"] Mar 20 16:02:05 crc kubenswrapper[4764]: I0320 16:02:05.973884 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567036-7hjfl"] Mar 20 16:02:07 crc kubenswrapper[4764]: I0320 16:02:07.137082 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0022e3ae-b53d-4b86-9c04-bd57f64528c5" path="/var/lib/kubelet/pods/0022e3ae-b53d-4b86-9c04-bd57f64528c5/volumes" Mar 20 16:02:12 crc kubenswrapper[4764]: I0320 16:02:12.040347 4764 scope.go:117] "RemoveContainer" containerID="febddfc67e869fd54db9f23fa82570c0685aef0dfe210f7b92e440a08ead66c5" Mar 20 16:02:38 crc kubenswrapper[4764]: I0320 16:02:38.444327 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:02:38 crc kubenswrapper[4764]: I0320 16:02:38.444942 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:03:08 crc kubenswrapper[4764]: I0320 16:03:08.443288 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:03:08 crc kubenswrapper[4764]: 
I0320 16:03:08.445089 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:03:38 crc kubenswrapper[4764]: I0320 16:03:38.443668 4764 patch_prober.go:28] interesting pod/machine-config-daemon-6wln5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 16:03:38 crc kubenswrapper[4764]: I0320 16:03:38.444431 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 16:03:38 crc kubenswrapper[4764]: I0320 16:03:38.444501 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" Mar 20 16:03:38 crc kubenswrapper[4764]: I0320 16:03:38.445587 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718"} pod="openshift-machine-config-operator/machine-config-daemon-6wln5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 16:03:38 crc kubenswrapper[4764]: I0320 16:03:38.445668 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" 
containerName="machine-config-daemon" containerID="cri-o://c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" gracePeriod=600 Mar 20 16:03:38 crc kubenswrapper[4764]: E0320 16:03:38.592678 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:03:39 crc kubenswrapper[4764]: I0320 16:03:39.455670 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" exitCode=0 Mar 20 16:03:39 crc kubenswrapper[4764]: I0320 16:03:39.455767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" event={"ID":"cf5cd911-963e-480f-8bc2-6be581e6d9e5","Type":"ContainerDied","Data":"c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718"} Mar 20 16:03:39 crc kubenswrapper[4764]: I0320 16:03:39.455838 4764 scope.go:117] "RemoveContainer" containerID="57e4847909fe1be173f31014a6d9365176dfb85598b100075a9cb0f8fc8dac5f" Mar 20 16:03:39 crc kubenswrapper[4764]: I0320 16:03:39.456426 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:03:39 crc kubenswrapper[4764]: E0320 16:03:39.456681 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:03:54 crc kubenswrapper[4764]: I0320 16:03:54.126816 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:03:54 crc kubenswrapper[4764]: E0320 16:03:54.127394 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:03:59 crc kubenswrapper[4764]: I0320 16:03:59.675010 4764 generic.go:334] "Generic (PLEG): container finished" podID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerID="1621eb75720d41d35965fd43babccf5f2859e3dc7027bffd2a1201c019dbd437" exitCode=0 Mar 20 16:03:59 crc kubenswrapper[4764]: I0320 16:03:59.675037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gtskb/must-gather-gvc5c" event={"ID":"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13","Type":"ContainerDied","Data":"1621eb75720d41d35965fd43babccf5f2859e3dc7027bffd2a1201c019dbd437"} Mar 20 16:03:59 crc kubenswrapper[4764]: I0320 16:03:59.676139 4764 scope.go:117] "RemoveContainer" containerID="1621eb75720d41d35965fd43babccf5f2859e3dc7027bffd2a1201c019dbd437" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.143009 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567044-rw7hp"] Mar 20 16:04:00 crc kubenswrapper[4764]: E0320 16:04:00.143575 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f66340-cdef-4b02-889d-c43083631d74" containerName="oc" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.143600 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="81f66340-cdef-4b02-889d-c43083631d74" containerName="oc" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.143909 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f66340-cdef-4b02-889d-c43083631d74" containerName="oc" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.144566 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.147137 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.147297 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.147331 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.151160 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-rw7hp"] Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.299046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj2z9\" (UniqueName: \"kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9\") pod \"auto-csr-approver-29567044-rw7hp\" (UID: \"990e649f-0928-4ec7-acdd-1f3e4a980e7e\") " pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.307604 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gtskb_must-gather-gvc5c_9b5dac44-c5fd-4b4a-9095-f5fb237f4b13/gather/0.log" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.401279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dj2z9\" (UniqueName: \"kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9\") pod \"auto-csr-approver-29567044-rw7hp\" (UID: \"990e649f-0928-4ec7-acdd-1f3e4a980e7e\") " pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.420850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj2z9\" (UniqueName: \"kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9\") pod \"auto-csr-approver-29567044-rw7hp\" (UID: \"990e649f-0928-4ec7-acdd-1f3e4a980e7e\") " pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.502625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.961350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567044-rw7hp"] Mar 20 16:04:00 crc kubenswrapper[4764]: W0320 16:04:00.968335 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod990e649f_0928_4ec7_acdd_1f3e4a980e7e.slice/crio-3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837 WatchSource:0}: Error finding container 3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837: Status 404 returned error can't find the container with id 3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837 Mar 20 16:04:00 crc kubenswrapper[4764]: I0320 16:04:00.972345 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 16:04:01 crc kubenswrapper[4764]: I0320 16:04:01.703032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" 
event={"ID":"990e649f-0928-4ec7-acdd-1f3e4a980e7e","Type":"ContainerStarted","Data":"3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837"} Mar 20 16:04:02 crc kubenswrapper[4764]: I0320 16:04:02.712704 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" event={"ID":"990e649f-0928-4ec7-acdd-1f3e4a980e7e","Type":"ContainerStarted","Data":"1fa2b67b7f54ffc4bcf49822df12be98e9b3596572d4a9db484e2d057993f852"} Mar 20 16:04:02 crc kubenswrapper[4764]: I0320 16:04:02.738910 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" podStartSLOduration=1.539103627 podStartE2EDuration="2.738888923s" podCreationTimestamp="2026-03-20 16:04:00 +0000 UTC" firstStartedPulling="2026-03-20 16:04:00.972074858 +0000 UTC m=+4362.588263987" lastFinishedPulling="2026-03-20 16:04:02.171860144 +0000 UTC m=+4363.788049283" observedRunningTime="2026-03-20 16:04:02.730159312 +0000 UTC m=+4364.346348461" watchObservedRunningTime="2026-03-20 16:04:02.738888923 +0000 UTC m=+4364.355078052" Mar 20 16:04:03 crc kubenswrapper[4764]: I0320 16:04:03.726118 4764 generic.go:334] "Generic (PLEG): container finished" podID="990e649f-0928-4ec7-acdd-1f3e4a980e7e" containerID="1fa2b67b7f54ffc4bcf49822df12be98e9b3596572d4a9db484e2d057993f852" exitCode=0 Mar 20 16:04:03 crc kubenswrapper[4764]: I0320 16:04:03.726435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" event={"ID":"990e649f-0928-4ec7-acdd-1f3e4a980e7e","Type":"ContainerDied","Data":"1fa2b67b7f54ffc4bcf49822df12be98e9b3596572d4a9db484e2d057993f852"} Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.126954 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:04:05 crc kubenswrapper[4764]: E0320 16:04:05.127461 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.589468 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.729124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj2z9\" (UniqueName: \"kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9\") pod \"990e649f-0928-4ec7-acdd-1f3e4a980e7e\" (UID: \"990e649f-0928-4ec7-acdd-1f3e4a980e7e\") " Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.736198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9" (OuterVolumeSpecName: "kube-api-access-dj2z9") pod "990e649f-0928-4ec7-acdd-1f3e4a980e7e" (UID: "990e649f-0928-4ec7-acdd-1f3e4a980e7e"). InnerVolumeSpecName "kube-api-access-dj2z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.746088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" event={"ID":"990e649f-0928-4ec7-acdd-1f3e4a980e7e","Type":"ContainerDied","Data":"3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837"} Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.746127 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a562e0aa91ce266d7434e149a281064e52c216a43d39fc18018e0e9d54f3837" Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.746181 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567044-rw7hp" Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.796119 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-vzfgd"] Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.805246 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567038-vzfgd"] Mar 20 16:04:05 crc kubenswrapper[4764]: I0320 16:04:05.832259 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj2z9\" (UniqueName: \"kubernetes.io/projected/990e649f-0928-4ec7-acdd-1f3e4a980e7e-kube-api-access-dj2z9\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:07 crc kubenswrapper[4764]: I0320 16:04:07.140500 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c296ef8-0537-4c67-a4cb-bcd3fe6211d6" path="/var/lib/kubelet/pods/4c296ef8-0537-4c67-a4cb-bcd3fe6211d6/volumes" Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.370891 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gtskb/must-gather-gvc5c"] Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.371209 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-gtskb/must-gather-gvc5c" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="copy" containerID="cri-o://6da3866c15059f3ade16ff9e4ea75d213f3b63ba3e9cba6be3c114c7c16e8e6f" gracePeriod=2 Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.381967 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gtskb/must-gather-gvc5c"] Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.777771 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gtskb_must-gather-gvc5c_9b5dac44-c5fd-4b4a-9095-f5fb237f4b13/copy/0.log" Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.778650 4764 generic.go:334] "Generic (PLEG): container finished" podID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerID="6da3866c15059f3ade16ff9e4ea75d213f3b63ba3e9cba6be3c114c7c16e8e6f" exitCode=143 Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.921994 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gtskb_must-gather-gvc5c_9b5dac44-c5fd-4b4a-9095-f5fb237f4b13/copy/0.log" Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.922333 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gtskb/must-gather-gvc5c" Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.996844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output\") pod \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " Mar 20 16:04:08 crc kubenswrapper[4764]: I0320 16:04:08.997130 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdkgv\" (UniqueName: \"kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv\") pod \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\" (UID: \"9b5dac44-c5fd-4b4a-9095-f5fb237f4b13\") " Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.002740 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv" (OuterVolumeSpecName: "kube-api-access-bdkgv") pod "9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" (UID: "9b5dac44-c5fd-4b4a-9095-f5fb237f4b13"). InnerVolumeSpecName "kube-api-access-bdkgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.099900 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdkgv\" (UniqueName: \"kubernetes.io/projected/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-kube-api-access-bdkgv\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.193208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" (UID: "9b5dac44-c5fd-4b4a-9095-f5fb237f4b13"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.202138 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.789714 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gtskb_must-gather-gvc5c_9b5dac44-c5fd-4b4a-9095-f5fb237f4b13/copy/0.log" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.790242 4764 scope.go:117] "RemoveContainer" containerID="6da3866c15059f3ade16ff9e4ea75d213f3b63ba3e9cba6be3c114c7c16e8e6f" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.790426 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gtskb/must-gather-gvc5c" Mar 20 16:04:09 crc kubenswrapper[4764]: I0320 16:04:09.819555 4764 scope.go:117] "RemoveContainer" containerID="1621eb75720d41d35965fd43babccf5f2859e3dc7027bffd2a1201c019dbd437" Mar 20 16:04:11 crc kubenswrapper[4764]: I0320 16:04:11.139272 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" path="/var/lib/kubelet/pods/9b5dac44-c5fd-4b4a-9095-f5fb237f4b13/volumes" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.133437 4764 scope.go:117] "RemoveContainer" containerID="0440f51eca52947dbe76d0dec6fec5fe69c76555741152e187ee1e4bb83eadde" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.180840 4764 scope.go:117] "RemoveContainer" containerID="41bf334453cfa54e9f75adb3040a4129857833408c37eac2463ab7d30ab28294" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.329556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:12 crc kubenswrapper[4764]: E0320 16:04:12.329955 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="copy" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.329970 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="copy" Mar 20 16:04:12 crc kubenswrapper[4764]: E0320 16:04:12.329984 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990e649f-0928-4ec7-acdd-1f3e4a980e7e" containerName="oc" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.329989 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="990e649f-0928-4ec7-acdd-1f3e4a980e7e" containerName="oc" Mar 20 16:04:12 crc kubenswrapper[4764]: E0320 16:04:12.330010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="gather" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.330016 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="gather" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.330183 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="990e649f-0928-4ec7-acdd-1f3e4a980e7e" containerName="oc" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.330202 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="gather" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.330216 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5dac44-c5fd-4b4a-9095-f5fb237f4b13" containerName="copy" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.335417 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.343615 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.370654 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.370731 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.370759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx5jg\" (UniqueName: \"kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.473021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.473100 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.473130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx5jg\" (UniqueName: \"kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.473563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.473573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.493020 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx5jg\" (UniqueName: \"kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg\") pod \"certified-operators-h87j7\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:12 crc kubenswrapper[4764]: I0320 16:04:12.660367 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:13 crc kubenswrapper[4764]: I0320 16:04:13.311286 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:13 crc kubenswrapper[4764]: I0320 16:04:13.829697 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerID="06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c" exitCode=0 Mar 20 16:04:13 crc kubenswrapper[4764]: I0320 16:04:13.829763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerDied","Data":"06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c"} Mar 20 16:04:13 crc kubenswrapper[4764]: I0320 16:04:13.829807 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerStarted","Data":"5556be4177031722ceb84217a6a13da41f921dddee5976e7ab6fded5e23f1bdb"} Mar 20 16:04:15 crc kubenswrapper[4764]: I0320 16:04:15.850151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerStarted","Data":"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4"} Mar 20 16:04:16 crc kubenswrapper[4764]: I0320 16:04:16.860823 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerID="05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4" exitCode=0 Mar 20 16:04:16 crc kubenswrapper[4764]: I0320 16:04:16.860936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" 
event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerDied","Data":"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4"} Mar 20 16:04:17 crc kubenswrapper[4764]: I0320 16:04:17.870521 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerStarted","Data":"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8"} Mar 20 16:04:17 crc kubenswrapper[4764]: I0320 16:04:17.892894 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h87j7" podStartSLOduration=2.279510853 podStartE2EDuration="5.892870157s" podCreationTimestamp="2026-03-20 16:04:12 +0000 UTC" firstStartedPulling="2026-03-20 16:04:13.832775606 +0000 UTC m=+4375.448964775" lastFinishedPulling="2026-03-20 16:04:17.44613495 +0000 UTC m=+4379.062324079" observedRunningTime="2026-03-20 16:04:17.886412977 +0000 UTC m=+4379.502602106" watchObservedRunningTime="2026-03-20 16:04:17.892870157 +0000 UTC m=+4379.509059286" Mar 20 16:04:18 crc kubenswrapper[4764]: I0320 16:04:18.127283 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:04:18 crc kubenswrapper[4764]: E0320 16:04:18.127604 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:04:22 crc kubenswrapper[4764]: I0320 16:04:22.661277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:22 crc 
kubenswrapper[4764]: I0320 16:04:22.661949 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:22 crc kubenswrapper[4764]: I0320 16:04:22.727149 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:22 crc kubenswrapper[4764]: I0320 16:04:22.978101 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:23 crc kubenswrapper[4764]: I0320 16:04:23.051301 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:24 crc kubenswrapper[4764]: I0320 16:04:24.947091 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h87j7" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="registry-server" containerID="cri-o://5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8" gracePeriod=2 Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.504177 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.622876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx5jg\" (UniqueName: \"kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg\") pod \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.623248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities\") pod \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.623415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content\") pod \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\" (UID: \"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4\") " Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.624134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities" (OuterVolumeSpecName: "utilities") pod "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" (UID: "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.631349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg" (OuterVolumeSpecName: "kube-api-access-wx5jg") pod "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" (UID: "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4"). InnerVolumeSpecName "kube-api-access-wx5jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.725894 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx5jg\" (UniqueName: \"kubernetes.io/projected/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-kube-api-access-wx5jg\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.725947 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.960943 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerID="5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8" exitCode=0 Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.961010 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h87j7" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.961036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerDied","Data":"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8"} Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.961923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h87j7" event={"ID":"5b7cbc55-9a87-4658-9674-6ea5bebbcaa4","Type":"ContainerDied","Data":"5556be4177031722ceb84217a6a13da41f921dddee5976e7ab6fded5e23f1bdb"} Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.961963 4764 scope.go:117] "RemoveContainer" containerID="5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8" Mar 20 16:04:25 crc kubenswrapper[4764]: I0320 16:04:25.982391 4764 scope.go:117] "RemoveContainer" 
containerID="05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.024693 4764 scope.go:117] "RemoveContainer" containerID="06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.095578 4764 scope.go:117] "RemoveContainer" containerID="5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8" Mar 20 16:04:26 crc kubenswrapper[4764]: E0320 16:04:26.096135 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8\": container with ID starting with 5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8 not found: ID does not exist" containerID="5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.096207 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8"} err="failed to get container status \"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8\": rpc error: code = NotFound desc = could not find container \"5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8\": container with ID starting with 5c2483ddb65bfb3b9374ac7c9a4f98f63b8c4bf5ffbcb8aa16dfdda356bb37a8 not found: ID does not exist" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.096250 4764 scope.go:117] "RemoveContainer" containerID="05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4" Mar 20 16:04:26 crc kubenswrapper[4764]: E0320 16:04:26.096906 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4\": container with ID starting with 
05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4 not found: ID does not exist" containerID="05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.096954 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4"} err="failed to get container status \"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4\": rpc error: code = NotFound desc = could not find container \"05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4\": container with ID starting with 05d4fb3eaf65cb449e1bd0e41572d8d1a30a37224f4e0720f87bfdaa4ada93f4 not found: ID does not exist" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.096987 4764 scope.go:117] "RemoveContainer" containerID="06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c" Mar 20 16:04:26 crc kubenswrapper[4764]: E0320 16:04:26.097966 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c\": container with ID starting with 06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c not found: ID does not exist" containerID="06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.098070 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c"} err="failed to get container status \"06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c\": rpc error: code = NotFound desc = could not find container \"06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c\": container with ID starting with 06a0879ddf4e9734c8b264b9ae1be27a105534cf413e90b0f0964df5278ae28c not found: ID does not 
exist" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.128210 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" (UID: "5b7cbc55-9a87-4658-9674-6ea5bebbcaa4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.137971 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.320812 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:26 crc kubenswrapper[4764]: I0320 16:04:26.330224 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h87j7"] Mar 20 16:04:27 crc kubenswrapper[4764]: I0320 16:04:27.137894 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" path="/var/lib/kubelet/pods/5b7cbc55-9a87-4658-9674-6ea5bebbcaa4/volumes" Mar 20 16:04:33 crc kubenswrapper[4764]: I0320 16:04:33.126567 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:04:33 crc kubenswrapper[4764]: E0320 16:04:33.127666 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 
16:04:48 crc kubenswrapper[4764]: I0320 16:04:48.126615 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:04:48 crc kubenswrapper[4764]: E0320 16:04:48.127324 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:04:59 crc kubenswrapper[4764]: I0320 16:04:59.132185 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:04:59 crc kubenswrapper[4764]: E0320 16:04:59.133005 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:05:12 crc kubenswrapper[4764]: I0320 16:05:12.280259 4764 scope.go:117] "RemoveContainer" containerID="1148f46a2fc0f16672e25a26cef7b7f41995c3d88f61b7bd07ea48022ec50b6e" Mar 20 16:05:13 crc kubenswrapper[4764]: I0320 16:05:13.126427 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:05:13 crc kubenswrapper[4764]: E0320 16:05:13.126960 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:05:28 crc kubenswrapper[4764]: I0320 16:05:28.126587 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:05:28 crc kubenswrapper[4764]: E0320 16:05:28.127489 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:05:42 crc kubenswrapper[4764]: I0320 16:05:42.126938 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:05:42 crc kubenswrapper[4764]: E0320 16:05:42.127740 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:05:57 crc kubenswrapper[4764]: I0320 16:05:57.127005 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:05:57 crc kubenswrapper[4764]: E0320 16:05:57.127857 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.150419 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567046-jpm8c"] Mar 20 16:06:00 crc kubenswrapper[4764]: E0320 16:06:00.151347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="extract-content" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.151360 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="extract-content" Mar 20 16:06:00 crc kubenswrapper[4764]: E0320 16:06:00.151410 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="extract-utilities" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.151417 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="extract-utilities" Mar 20 16:06:00 crc kubenswrapper[4764]: E0320 16:06:00.151429 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="registry-server" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.151435 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="registry-server" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.151627 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7cbc55-9a87-4658-9674-6ea5bebbcaa4" containerName="registry-server" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.152358 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.158687 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.158730 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.158895 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.161164 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-jpm8c"] Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.273770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cws54\" (UniqueName: \"kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54\") pod \"auto-csr-approver-29567046-jpm8c\" (UID: \"e796256c-a1f4-4094-aaee-7b1721bca646\") " pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.375618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cws54\" (UniqueName: \"kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54\") pod \"auto-csr-approver-29567046-jpm8c\" (UID: \"e796256c-a1f4-4094-aaee-7b1721bca646\") " pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.398008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cws54\" (UniqueName: \"kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54\") pod \"auto-csr-approver-29567046-jpm8c\" (UID: \"e796256c-a1f4-4094-aaee-7b1721bca646\") " 
pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.472041 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:00 crc kubenswrapper[4764]: I0320 16:06:00.917845 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567046-jpm8c"] Mar 20 16:06:01 crc kubenswrapper[4764]: I0320 16:06:01.812580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" event={"ID":"e796256c-a1f4-4094-aaee-7b1721bca646","Type":"ContainerStarted","Data":"68d0b39375cd5383544b4343bcab427d49a2890b3b6ca9a0e13ac3ca63a5039b"} Mar 20 16:06:02 crc kubenswrapper[4764]: I0320 16:06:02.823835 4764 generic.go:334] "Generic (PLEG): container finished" podID="e796256c-a1f4-4094-aaee-7b1721bca646" containerID="456a1b6998458f46428ec4a718000750f2779d409ad2fcd3a0b52f796608479c" exitCode=0 Mar 20 16:06:02 crc kubenswrapper[4764]: I0320 16:06:02.823929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" event={"ID":"e796256c-a1f4-4094-aaee-7b1721bca646","Type":"ContainerDied","Data":"456a1b6998458f46428ec4a718000750f2779d409ad2fcd3a0b52f796608479c"} Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.153083 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.262408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cws54\" (UniqueName: \"kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54\") pod \"e796256c-a1f4-4094-aaee-7b1721bca646\" (UID: \"e796256c-a1f4-4094-aaee-7b1721bca646\") " Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.268813 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54" (OuterVolumeSpecName: "kube-api-access-cws54") pod "e796256c-a1f4-4094-aaee-7b1721bca646" (UID: "e796256c-a1f4-4094-aaee-7b1721bca646"). InnerVolumeSpecName "kube-api-access-cws54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.364612 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cws54\" (UniqueName: \"kubernetes.io/projected/e796256c-a1f4-4094-aaee-7b1721bca646-kube-api-access-cws54\") on node \"crc\" DevicePath \"\"" Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.844726 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" event={"ID":"e796256c-a1f4-4094-aaee-7b1721bca646","Type":"ContainerDied","Data":"68d0b39375cd5383544b4343bcab427d49a2890b3b6ca9a0e13ac3ca63a5039b"} Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.844765 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d0b39375cd5383544b4343bcab427d49a2890b3b6ca9a0e13ac3ca63a5039b" Mar 20 16:06:04 crc kubenswrapper[4764]: I0320 16:06:04.844798 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567046-jpm8c" Mar 20 16:06:05 crc kubenswrapper[4764]: I0320 16:06:05.236982 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-x4qfd"] Mar 20 16:06:05 crc kubenswrapper[4764]: I0320 16:06:05.245829 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567040-x4qfd"] Mar 20 16:06:07 crc kubenswrapper[4764]: I0320 16:06:07.139825 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed49413b-1501-46fd-a4df-7e17198a21d7" path="/var/lib/kubelet/pods/ed49413b-1501-46fd-a4df-7e17198a21d7/volumes" Mar 20 16:06:08 crc kubenswrapper[4764]: I0320 16:06:08.126514 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:06:08 crc kubenswrapper[4764]: E0320 16:06:08.127115 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:06:12 crc kubenswrapper[4764]: I0320 16:06:12.360908 4764 scope.go:117] "RemoveContainer" containerID="febd3d27105ca5cb0b35515cf5c111388c57c0fffd34bea7035a60f96275a4eb" Mar 20 16:06:19 crc kubenswrapper[4764]: I0320 16:06:19.136734 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:06:19 crc kubenswrapper[4764]: E0320 16:06:19.137659 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:06:33 crc kubenswrapper[4764]: I0320 16:06:33.127017 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:06:33 crc kubenswrapper[4764]: E0320 16:06:33.128258 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:06:45 crc kubenswrapper[4764]: I0320 16:06:45.130098 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:06:45 crc kubenswrapper[4764]: E0320 16:06:45.130881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:06:58 crc kubenswrapper[4764]: I0320 16:06:58.125884 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:06:58 crc kubenswrapper[4764]: E0320 16:06:58.126769 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:07:13 crc kubenswrapper[4764]: I0320 16:07:13.126689 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:07:13 crc kubenswrapper[4764]: E0320 16:07:13.127522 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:07:28 crc kubenswrapper[4764]: I0320 16:07:28.126954 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:07:28 crc kubenswrapper[4764]: E0320 16:07:28.127732 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:07:43 crc kubenswrapper[4764]: I0320 16:07:43.128027 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:07:43 crc kubenswrapper[4764]: E0320 16:07:43.129139 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:07:57 crc kubenswrapper[4764]: I0320 16:07:57.127263 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:07:57 crc kubenswrapper[4764]: E0320 16:07:57.128948 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.146899 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567048-k4s22"] Mar 20 16:08:00 crc kubenswrapper[4764]: E0320 16:08:00.147833 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e796256c-a1f4-4094-aaee-7b1721bca646" containerName="oc" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.147848 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e796256c-a1f4-4094-aaee-7b1721bca646" containerName="oc" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.148080 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e796256c-a1f4-4094-aaee-7b1721bca646" containerName="oc" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.148754 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.151520 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.152131 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qkmp7" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.152454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.160930 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-k4s22"] Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.279852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xxzz\" (UniqueName: \"kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz\") pod \"auto-csr-approver-29567048-k4s22\" (UID: \"73942b68-76d2-4c04-8ffc-6f8bd8783936\") " pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.382244 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xxzz\" (UniqueName: \"kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz\") pod \"auto-csr-approver-29567048-k4s22\" (UID: \"73942b68-76d2-4c04-8ffc-6f8bd8783936\") " pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.412696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xxzz\" (UniqueName: \"kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz\") pod \"auto-csr-approver-29567048-k4s22\" (UID: \"73942b68-76d2-4c04-8ffc-6f8bd8783936\") " 
pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.468139 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:00 crc kubenswrapper[4764]: I0320 16:08:00.939446 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567048-k4s22"] Mar 20 16:08:01 crc kubenswrapper[4764]: I0320 16:08:01.006471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-k4s22" event={"ID":"73942b68-76d2-4c04-8ffc-6f8bd8783936","Type":"ContainerStarted","Data":"8387e3cd6dc4ff906ef0094d55b77fbacf83291b76a918b080418b647a81573d"} Mar 20 16:08:03 crc kubenswrapper[4764]: I0320 16:08:03.025317 4764 generic.go:334] "Generic (PLEG): container finished" podID="73942b68-76d2-4c04-8ffc-6f8bd8783936" containerID="f17eb45e9dc4cd35c6c43074d87aa3dfa4c46872fd558f290fbec147a0062530" exitCode=0 Mar 20 16:08:03 crc kubenswrapper[4764]: I0320 16:08:03.025422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-k4s22" event={"ID":"73942b68-76d2-4c04-8ffc-6f8bd8783936","Type":"ContainerDied","Data":"f17eb45e9dc4cd35c6c43074d87aa3dfa4c46872fd558f290fbec147a0062530"} Mar 20 16:08:04 crc kubenswrapper[4764]: I0320 16:08:04.431941 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:04 crc kubenswrapper[4764]: I0320 16:08:04.555303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xxzz\" (UniqueName: \"kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz\") pod \"73942b68-76d2-4c04-8ffc-6f8bd8783936\" (UID: \"73942b68-76d2-4c04-8ffc-6f8bd8783936\") " Mar 20 16:08:04 crc kubenswrapper[4764]: I0320 16:08:04.884013 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz" (OuterVolumeSpecName: "kube-api-access-4xxzz") pod "73942b68-76d2-4c04-8ffc-6f8bd8783936" (UID: "73942b68-76d2-4c04-8ffc-6f8bd8783936"). InnerVolumeSpecName "kube-api-access-4xxzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 16:08:04 crc kubenswrapper[4764]: I0320 16:08:04.964648 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xxzz\" (UniqueName: \"kubernetes.io/projected/73942b68-76d2-4c04-8ffc-6f8bd8783936-kube-api-access-4xxzz\") on node \"crc\" DevicePath \"\"" Mar 20 16:08:05 crc kubenswrapper[4764]: I0320 16:08:05.048547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567048-k4s22" event={"ID":"73942b68-76d2-4c04-8ffc-6f8bd8783936","Type":"ContainerDied","Data":"8387e3cd6dc4ff906ef0094d55b77fbacf83291b76a918b080418b647a81573d"} Mar 20 16:08:05 crc kubenswrapper[4764]: I0320 16:08:05.048622 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567048-k4s22" Mar 20 16:08:05 crc kubenswrapper[4764]: I0320 16:08:05.048630 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8387e3cd6dc4ff906ef0094d55b77fbacf83291b76a918b080418b647a81573d" Mar 20 16:08:05 crc kubenswrapper[4764]: I0320 16:08:05.516961 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-prmhk"] Mar 20 16:08:05 crc kubenswrapper[4764]: I0320 16:08:05.524443 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567042-prmhk"] Mar 20 16:08:07 crc kubenswrapper[4764]: I0320 16:08:07.138257 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f66340-cdef-4b02-889d-c43083631d74" path="/var/lib/kubelet/pods/81f66340-cdef-4b02-889d-c43083631d74/volumes" Mar 20 16:08:10 crc kubenswrapper[4764]: I0320 16:08:10.127328 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:08:10 crc kubenswrapper[4764]: E0320 16:08:10.128008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:08:12 crc kubenswrapper[4764]: I0320 16:08:12.481485 4764 scope.go:117] "RemoveContainer" containerID="055e5af2b4ba3fa6fe7f6b01b179ad558c895321650653610d6c3a2f0e1cb544" Mar 20 16:08:21 crc kubenswrapper[4764]: I0320 16:08:21.126635 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:08:21 crc kubenswrapper[4764]: E0320 16:08:21.127814 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5" Mar 20 16:08:35 crc kubenswrapper[4764]: I0320 16:08:35.126744 4764 scope.go:117] "RemoveContainer" containerID="c80bb9d25e48b588b1a1ecef4828bb9f1fd62352e65cbbc66ddd6d12be67b718" Mar 20 16:08:35 crc kubenswrapper[4764]: E0320 16:08:35.127849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wln5_openshift-machine-config-operator(cf5cd911-963e-480f-8bc2-6be581e6d9e5)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wln5" podUID="cf5cd911-963e-480f-8bc2-6be581e6d9e5"